Thursday, November 13, 2008

InfoPath in SSIS

Importing XML files generated by InfoPath has some nuances:

The article 'SSIS: Using InfoPath XML Files in SSIS XMLSource Adapter' at SQLJunkies.com, by 'ashvinis', covers them pretty well.

Basically, to make the XML file available in such a way that it can be mapped to a relational source of some type, XML Tasks are needed to:
1. Remove extra namespaces

2. Wrap the InfoPath fields in a Table/Fields Schema

(The article points out that a 'default' InfoPath form will only contain a 'myFields' element, i.e.:)

<?xml version="1.0" encoding="UTF-8"?>
<?mso-infoPathSolution solutionVersion="1.0.0.2" productVersion="11.0.6357" PIVersion="1.0.0.0" href="file:///c:\infofile.xsn" name="urn:schemas-microsoft-com:office:infopath:Info1:-myXSD-2005-04-27T19-26-55" ?>
<?mso-application progid="InfoPath.Document"?>
<my:myFields xmlns:my="http://schemas.microsoft.com/office/infopath/2003/myXSD/2005-04-27T19:26:55">
  <my:FirstName>Wenyang</my:FirstName>
  <my:LastName>Hu</my:LastName>
  <my:PhoneNumber>425-123-4567</my:PhoneNumber>
</my:myFields>

I've written a small XSLT file that will wrap this default 'myFields' element in a 'myTable' element (note that the 'my' namespace must be declared in the stylesheet so the template can match 'my:myFields'):

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:my="http://schemas.microsoft.com/office/infopath/2003/myXSD/2005-04-27T19:26:55">

  <xsl:output method="xml" encoding="utf-8"/>

  <!-- identity template for elements, comments, and processing instructions -->
  <xsl:template match="*|comment()|processing-instruction()">
    <xsl:copy>
      <xsl:apply-templates />
    </xsl:copy>
  </xsl:template>

  <!-- wrap the form's document element in a myTable element -->
  <xsl:template match="my:myFields">
    <myTable>
      <xsl:copy>
        <xsl:apply-templates />
      </xsl:copy>
    </myTable>
  </xsl:template>
</xsl:stylesheet>

This can be layered with the steps in the SQLJunkies article to provide a fully automated prep of InfoPath XML, which can then be used in imports to relational databases such as SQL Server.
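The two XML Task steps (strip the extra namespaces, wrap the fields in a table element) can also be sketched outside SSIS in a few lines of Python. This is just an illustration of the transform itself, using a trimmed-down sample with a shortened namespace URI; it is not part of the SSIS package:

```python
import xml.etree.ElementTree as ET

def prep_infopath_xml(source):
    """Strip namespace qualifiers and wrap the InfoPath document
    element in a myTable element, mimicking the two XML Tasks."""
    root = ET.fromstring(source)
    for el in root.iter():
        # tags parse as '{namespace-uri}LocalName'; keep only LocalName
        if '}' in el.tag:
            el.tag = el.tag.split('}', 1)[1]
    table = ET.Element('myTable')
    table.append(root)
    return ET.tostring(table, encoding='unicode')

# a trimmed-down sample form (namespace URI shortened for brevity)
sample = ('<my:myFields xmlns:my="urn:example">'
          '<my:FirstName>Wenyang</my:FirstName>'
          '</my:myFields>')
print(prep_infopath_xml(sample))
```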

Friday, October 24, 2008

SQL Server Express Edition

Found a good article on MSDN - comparing the various downloads of SQL Server Express, entitled 'What's up with SQL Server 2008 Express editions'.

I just completed setting up a VPC with SQL Server 2008 Express as a demo environment for an article I'm drafting. The 2008 version, at least the 'with Advanced Services' download, is 512 MB (SQL Server 2005 Express Edition with Advanced Services was 234 MB). I haven't been able to find a source on this, but it appears that the 2008 version ships with the entire codebase needed to install any edition of SQL Server; the product key you enter then limits which edition (Express, Standard, Enterprise, etc.) is available to you.

Take note of the prerequisites:
  • .NET Framework 3.5 SP1
  • Windows Installer 4.5
  • Windows PowerShell 1.0

The install will not run without the first two, but you'll only find that out after the installation files are extracted (which was a bit sluggish on my VPC), and both require a reboot. The install will run without PowerShell but then warns that it is missing; PowerShell can be enabled without a reboot, and the SQL Server install can then continue.

Friday, October 10, 2008

New Open Source Project - File Server Documenter

A new open source project has been started at CodePlex for the File Server Documenter.
From the project's home page:
File Server Documenter contains routines and a basic interface to scan a file server and produce raw documentation. Oriented towards a customer/technical support person rather than the server administrator. It can be considered an auditing tool, or just providing general information about what the file server is being used for.
The first release is really a collection of very tactical tasks I've had over the past few years, starting with a single-file console app that scanned a file server for Access databases (the original exe was called MDBDocumenter, in fact).

Wednesday, October 8, 2008

A great tool for mounting Disk Image (*.iso) files

A great tool for mounting Disk Image (*.iso) files so they can be accessed through the Windows file system is Virtual Clone Drive from SlySoft. While not specifically related to Virtual PC, I've found that as I've used more virtualization, it's more common to store or move software as ISOs than to ever burn a physical CD or DVD. In situations where you need to access those on a host (non-Virtual PC) computer, the traditional way was to burn the ISO to a CD, then run the software from the CD. Virtual Clone Drive allows the ISO to be mounted directly.

The console is very minimal, allowing you to create up to 8 ‘drives’. Each of these shows up as a new drive letter in Windows.



On each new drive letter, simply right-click:



Choose Virtual Clone Drive ->‘Mount…’ and browse to your ISO


Plus, the drive letter has a wacky sheep icon as a conversation starter… what’s not to love?

Wednesday, October 1, 2008

SQL Server Integration Services – Beyond the Wizard – Part 1

In a method not entirely unusual for me, and I'm sure many of you, I'm picking up a technology 'on the fly': SQL Server Integration Services (SSIS) 2005. Since I have a fairly solid understanding of Extract-Transform-Load and other integration topics, and experience with the predecessor tool, Data Transformation Services (DTS), most of the learning curve has been the nuances of the tool.

I'm finding I'm learning it in essentially a 'refactoring' approach: I've been able to get the work done that I need to, but not always in the most elegant way.

This blog series will highlight some topics that will be of interest to the intermediate developer. You've been able to build a basic data flow, import/export, or other common routine, possibly only what the Import/Export Wizard in SQL Server Management Studio generates, perhaps a number of them following a similar pattern. The next task is to polish them off to make them more dynamic and maintainable on a long-term basis.

One of the first issues I noticed with my SSIS project was that the connection strings were stored in each package. This is what you will get if the results of the Import/Export Wizard are saved as an SSIS package. This example shows a routine import of a CSV file into an AdventureWorks database on the local SQL Server.




I didn't see an obvious way to move the 'Connection Managers' from the package to the project level; the actual connection string property of a Connection Manager seemed like the next bet, but I couldn't find a way to make it dynamic. While I had seen the Data Sources container under the project, I couldn't figure out how to reference these inside a package.

To change an SSIS package from using a per-package 'Connection Manager' to using a per-project 'Data Source' (useful if many SSIS packages use the same database) involves:


  • Creating a new project-level data source

  • Associating each Control Flow and Data Flow Task that uses the package's Connection Manager with the project Data Source

  • Deleting the package-level connection manager

1. Create a Project-Level Data Source


a. Right-Click the Data Sources folder under the project

b. Choose 'New Data Source'



c. Click 'Next ->'



d. Select the appropriate Server, Database, and Authentication



e. Choose 'OK'

2. Associate each Control Flow and Data Flow Task that uses the Package's Connection Manager


a. Where it now shows 'DestinationConnectionOLEDB' (the default name)



b. Choose the drop-down -> New Connection…



c. Select the data source created in Step 1 above



d. Repeat for any task objects that reference 'DestinationConnectionOLEDB'

3. Delete the Connection Manager


To tidy things up, from the connections managers tab, delete 'DestinationConnectionOLEDB'



Summary



These steps demonstrated converting from a per-package connection manager to a data source. The Data Source can now be referenced in other packages, providing a single home for the Server Name, Database Name, and Authentication.


In this case, only the target data source - AdventureWorks was created. The source file could have been created as a data source as well, but there is less value if one side or the other of the integration is relatively unique to that package. Where I was doing multiple connections between the same databases, it would make sense to create both as Data Sources.

Sunday, September 7, 2008

Learning DotNet

As an experienced developer, I'm often asked by colleagues looking to get into the field how I "learned to program". I did take formal classes at junior college and university, and to some extent my experience was vaguely finding my path into jobs that required me to pick up Access, and later Visual Basic and FoxPro, and start using them.

The other area that really filled in the gaps between theory and trial-and-error for me was an online education site called 'ZDU'. The site isn't around anymore, or at least not in the form it was then, but one of the authors/instructors who taught in that format, John Smiley, now accompanies his own books with internet classes through his website, http://www.johnsmiley.com.

I learned from him back on VB 5, then VB 6 (I still have a number of his spiral-bound ZDU 'workbooks' on the shelf). Reading the abstracts of his current books, updated of course for .NET 2.0, 3.0, etc. and expanded to VB.NET and C#, they appear to follow a similar format that I highly recommend: Mr. Smiley takes you from the beginning of a fictional project that gets built out chapter by chapter to a finished product at the end. I really like the format, as it tends to follow how I develop applications in the real world. I highly recommend his books, and his internet classes, to anyone looking to learn to program who wants to focus on the Microsoft .NET products.

Take a look at his classes or books

Thursday, August 28, 2008

SQL Server Agent musing - T-SQL code in the job steps

It turns out that SQL Server Agent job steps do not take part in sysdepends, the dependency tracking that lets SQL Server warn when a referenced object is missing or misspelled. For example:


--create a procedure that calls an object that doesn't exist
CREATE PROCEDURE tmpBlah AS
--execute a procedure that doesn't exist
EXEC dbo.blah2
GO
--clean up the procedure
DROP PROCEDURE tmpBlah



Will return:



Cannot add rows to sys.sql_dependencies for the stored procedure because it depends on the missing table 'dbo.blah2'. The stored procedure will still be created; however, it cannot be successfully executed until the table exists.



Creating a SQL Server Agent job step and setting the command text to:



EXEC dbo.blah2



No warning. When the job runs, it will fail on that step; one has to wait for the job history in order to see the error:



Could not find stored procedure 'dbo.blah2'



This reinforces a standard I've used: store most T-SQL code that would otherwise be scheduled directly in a job step in a stored procedure within the target database.
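The contrast is essentially a create-time dependency warning versus silent acceptance that fails only at run time. As a loose, purely illustrative model (hypothetical Python; SQL Server exposes nothing like this):

```python
class Catalog:
    """Toy model: creating a stored procedure checks dependencies and
    warns; creating an Agent job step performs no check at all."""

    def __init__(self):
        self.procs = {}

    def create_proc(self, name, calls):
        missing = [c for c in calls if c not in self.procs]
        self.procs[name] = calls  # created anyway, as SQL Server does
        if missing:
            return "warning: depends on missing " + ", ".join(missing)
        return "ok"

    def create_job_step(self, calls):
        return "ok"  # accepted silently, whatever it references

    def run(self, calls):
        for c in calls:
            if c not in self.procs:
                raise LookupError("Could not find stored procedure '%s'" % c)

db = Catalog()
print(db.create_proc("tmpBlah", ["dbo.blah2"]))  # warns at create time
print(db.create_job_step(["dbo.blah2"]))         # no warning until run time
```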

In a recent project I'm working on, a large amount of the data caching and aggregation code was stored in the SQL Server Agent job steps. I've since established a development standard of storing most code inside the target database, limiting Agent job steps to:


  • SQL Server Integration Services Package

  • Stored procedure – ideally within the target database

  • Limited small T-SQL commands such as truncating tables, creating cached copies (as soon as the commands are more than a few lines, or are very domain-specific, they will be moved to a stored procedure)

  • Maintenance tasks (such as DBCC DBREINDEX(@TableName, ' ', 90))

Wednesday, August 27, 2008

Virtual PC Resources

Resource for getting started with Virtual PC

Because I just introduced Virtual PC to a few new colleagues at work, I've collected here various resources used in the last year or two in my use of Virtual PC, from emails, favorites, etc.


Microsoft Virtual PC 2007 Technical Reference

A good starting point for setting up your first Virtual PC, using an established, supported OS such as Windows 98, 2000, or XP.


TechNet Webcast: Microsoft Virtual PC Technical Overview - Level 200

I've previously used this webcast to introduce Virtual PC to teams new to the product. It's a 2004 version, but other than slight differences in look-and-feel in 2007, as an end-user, the features hold true.


Virtual PC Guy

Good blog about issues with Virtual PC, setting up uncommon scenarios, alternative OS's, etc.


Specific Platforms

Microsoft VHD Downloads
Microsoft makes available a number of platforms as pre-built virtual hard drives, typically as a time-limited evaluation, but in some situations I've been able to then authorize them with MSDN product keys when using them for dev or test purposes. The Server 2003/2008 VHDs especially are great for starting off an eval of a server product, rather than stepping through the OS install...every...time.

Installing a New Microsoft Office SharePoint Server 2007 Portal: Step-by-Step Instructions
Not specifically addressing Virtual PC installations, but a common task within a Virtual PC/Server (download the evaluation edition).

Installing Microsoft Virtual PC and a DOS Virtual Machine
I haven't come up with an actually useful need for the DOS VM yet, but just having it and being able to boot it quickly drives the point home about being able to use Virtual PC for legacy support.

Linux
Installing Ubuntu 8.04 under Microsoft Virtual PC 2007
This was the most painless Linux distribution to install from scratch on a Virtual PC. But if you're far from a Linux master and just need a test bed for some web apps, the LAMP Virtual Appliance was a great out-of-the-box solution. This is a prebuilt virtual hard drive you can add to Virtual PC, preconfigured with a very lightweight Linux (no GUI), the Apache web server, a MySQL database, and the PHP, Perl, and Python engines. (Fully accessing it from your host relies on some reverse DNS configuration I didn't know, but for my purposes it was easy enough to alias the IP address and use that.)

Wednesday, June 18, 2008

SQL Server 2008 Launch

Attended Microsoft's 2008 launch event - Heroes Happen Here - in Sacramento two weeks ago. At the Sacramento Convention Center on J St.

In the morning, I was in the SQL Server session (collateral at http://sqlserver2008jumpstart.microsofttraining.com/content/info.asp?CcpSubsiteID=69&infoid=27). Most of it was fairly DBA-heavy. The first session was on infrastructure improvements; it wasn't very demo-friendly, covering a lot of security features that basically expanded on 2005 in areas such as encryption and securing functions. A new feature called 'Transparent Data Encryption' allows the whole database to be encrypted and, as the title describes, the developer does not need to change any code to work with it. It has a similar impact on the developer as using EFS to encrypt the file system, but there seems to be minimal performance hit and less overhead; plus the keys can be centrally managed, backed up, etc., where with EFS this is difficult or impossible. Other interesting factoids, mentioned as asides, that I need to research a bit more: the ability to expose the file system and XML files or sources as data objects, and a concept called 'Sync Services' (which may have been around in 2005, but was less prominent).

The second presentation introduced a really helpful feature for multi-server management called the Declarative Management Framework (DMF) for policy-based management. Basically, most settings or rules you could manually apply to one server can easily be applied to many servers. (This goes much further than the 'Master Server' concept in SQL 2000-2005, which applied scripts to multiple servers by kicking off SQL Server Agent on each target.) I saw it as basically three dimensions:
  • How to enforce the policy (log it only, or force the change (possibly 'breaking' the other server?))
  • When to enforce it (on demand, scheduled, or when changes occur)
  • On what servers (by version, by server OS, etc.; following a similar pattern as Group Policy, a variety of custom filters could be built up to apply the change to only certain databases or servers. For that matter, servers can be grouped in some custom manner too.)

In some of the environments I've worked in, with a large number of MSDE or SQL Express instances, this is a {deity}-send. I'm not completely sure how it works against occasionally-connected boxes; on demand would not be an option, but scheduled or on-change would, assuming the result of applying the policy placed some logic on each remote server... but perhaps only SQL 2008 contains the logic (it was mentioned that past versions of SQL are supported, but as remote servers, similar to how SQL 2000 boxes can be administered with SSMS).

Anywhom, using the 'log-only' feature, with the data captured, my idea was to create some model of a 'compliance score' from it: for example, a 'grade' each server would get for compliance with some standard. Some policies could be weighted differently; i.e. setting backup options may carry a higher weight than stored proc naming conventions... Will have to get a SQL 2008 box set up to take a look.
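As a rough sketch of that compliance-score idea (hypothetical Python; the policy names, weights, and pass/fail inputs are made up, not anything DMF emits):

```python
def compliance_score(results, weights):
    """Weighted 'grade' for one server's policy evaluation.

    results: {policy_name: True/False} captured from log-only mode
    weights: {policy_name: relative importance}
    Returns a 0-100 score.
    """
    total = sum(weights[p] for p in results)
    earned = sum(weights[p] for p, passed in results.items() if passed)
    return round(100.0 * earned / total, 1)

# e.g. backup options weighted higher than proc naming conventions
print(compliance_score(
    {"backup_options_set": True, "proc_naming": False},
    {"backup_options_set": 3, "proc_naming": 1},
))  # 75.0
```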

The last session of the SQL Server morning was on new features of Analysis Services. The features were largely incremental; since I haven't spent much time in this tool yet, I quickly surmised that I needed to work with it from the ground up, vs. learning what changed in 2008 vs. 2005. A brief demo of some changes in SSRS showed a tool called 'Report Designer'. It was definitely new, but it seemed to fill the same space as 'Report Builder', and unless I misheard, it was explicitly mentioned that both existed.

The afternoon session was on Visual Studio. I don't have as many notes on that, probably a creature of zoning out in the afternoon, and most of the features were very incremental (and I actually recall seeing a very similar demo on VSTO last year)

Christopher Schipper demo'd various ASP.net features: nested master pages, more CSS design and 'debugging', JavaScript debugging, LINQ to XML. One feature that stood out for me was a Sync feature: in the project, simply clicking 'Add -> Local Database Cache' allowed the creation of an offline copy of a data source to be shipped with the app, and one line of code, cacheRefTablesSyncAgent.cache, could be called to update the cache. One-way synchronization is easy; two-way is possible, but requires more code to handle events related to conflict management.

There was a demo of VSTO, creating a form that would be part of Outlook, could be used to consume/respond to an email.

Saturday, June 14, 2008

Project Management Professional (PMP)

In May 2008, I became a certified Project Management Professional (PMP), a credential through PMI. I've been interested for almost 3 years, joining PMI when I still lived in San Diego, but it took about a year or so of serious study and work towards it.

The PMP is the flagship certification of PMI, attesting to my experience, education, and knowledge managing projects. In my case, experience includes 13 projects over 7 years for 8 separate companies. All relevant experience had to be within the last 8 years and cover at least 4,500 hours of project work (this applied as I have a Bachelor's degree; without one, the experience requirement is larger). A goal of PMI is to have the PMP recognized around the world as the standard, approaching the status of a CPA in accounting, or 'passing the bar' for an attorney. PMPs work in a variety of industries, construction and manufacturing notably, and more and more in Information Technology (IT).

Obviously, my specialty is IT projects, though most of my projects are really business projects with a technology component. I plan to continue along my current career path, working with the State of California - Department of Corrections & Rehabilitation. I really believe that applying basic levels of project management to the smaller efforts I see will improve results. Basic definitions of scope and schedule would prevent many of our smaller efforts from trailing off endlessly. There is also a huge opportunity to parameterize smaller IT projects: projects where the planning, documentation, and other 'overhead' might on paper seem to overwhelm the actual effort to produce a little benefit, but the resulting product (report, database, document, etc.) might be used 10, 100, 1000 times over.

Learning the PMI material even in the last year has allowed me to manage projects more successfully. The PMP has just given me more 'tools in the toolbox' from which to manage projects, especially communications planning and human resource management. I can't say I've completely rethought my approach to the scope, time, and cost aspects; they were challenging without any formal guide as to how to manage them, and they are challenging with one. Really, the challenge in all of these areas is managing customer expectations, and learning the PMBOK gives me new ways to get those expectations out on the table.

Exam Prep


There's already a lot of mindshare out there on strategies to pass the exam, but I'll just add: more than any exam I've ever taken, it's not one you can 'cram' for. You have to know the material and live and breathe it, for at least some portion of your full-time activities, even if applying the PMBOK to a small or even personal project seems superfluous. That being said, I did find it advantageous to keep the PMI material at the 'top of my head' for the weeks leading up to the exam. This might not be your learning style, but it helped me more to hand-write, or type, tables of processes, study sheets, lists o' things, etc. than to read a pre-prepared one. Also, taking advice from an instructor with IIL, D.W. Nesper, reading PM Network magazine and some of the journals helped me to think in a reflexively PM-oriented manner about some of the questions. I did put together a quick Access database that helped me organize my thinking about the PM processes; if I get a chance to empty it out, I'll post it here as well.

Thursday, June 5, 2008

Scripting - Create a desktop shortcut to a URL (Updated for IE7)

This article discusses a script that creates a desktop shortcut to a particular URL, with a given custom icon. I know, pretty routine stuff, but when the requirement of a project is to deploy a 'desktop shortcut' to affected customers, I often find that deploying such a script on a web site, such that it can be accessed on demand, or pushing out via a login script or other type of automated technology meets the requirement.

The script:


  1. Copies a custom icon from a network share to the local computer (so the icon shows up, even if the user is offline)

  2. Creates a shortcut on the desktop, references the custom icon

  3. Prompts the user if they wish to launch the site immediately



I developed this script a number of years ago, and recently had the opportunity to re-use it for another app. Unfortunately, upon initial run, it worked except for the custom icon; it was using the default browser icon instead. I thought maybe I broke something, because I had also modified the target location of the custom icon to support Vista (as well as 9x-XP). I then tested versions of it I had written in the past that were still in production; those had the same issue.

It turns out the issue was on computers with Internet Explorer 7; Internet Explorer 6 computers were fine. The IE7 computers had a slightly different format in the *.url files placed on the desktop. I didn't discover this until I opened up the .url file in Notepad to compare one the script created to one that was manually created.

In IE6, the url's were basically this format:



[InternetShortcut]
URL=http://intranet/MyApp
IconIndex=0
IconFile=C:\Documents and Settings\TestUser\Application Data\MyApp.ico

While it is in INI file format (note the section 'InternetShortcut' in square brackets), there was only one section, so we were able to get away with simply appending the property=value lines to the file, as in the following snippet:



Dim fso
Set fso = WScript.CreateObject("Scripting.FileSystemObject")

Dim f
Set f = fso.GetFile(sURLLinkFile)

Dim contents
contents = ""

Dim line

Dim ots
Set ots = f.OpenAsTextStream(1)

'read in the contents of the file, except for any existing icon settings
Do While Not ots.AtEndOfStream
    line = ots.ReadLine
    If InStr(1, line, "IconIndex", 1) = 0 And InStr(1, line, "IconFile", 1) = 0 Then
        contents = contents & line & vbCrLf
    End If
Loop
ots.Close

'add the lines for icon index and file
If Not (IsNull(iIconIndex) Or IsNull(IconLocation)) Then
    If Not (IsEmpty(iIconIndex) Or IsEmpty(IconLocation)) Then
        contents = contents & "IconIndex=" & CStr(iIconIndex) & vbCrLf
        contents = contents & "IconFile=" & IconLocation
    End If
End If

Set ots = f.OpenAsTextStream(2)
ots.Write contents
ots.Close

Set ots = Nothing
Set f = Nothing
Set fso = Nothing

The IE7 version has added a section (notice the GUID in square brackets) and a few more properties.


So, absent any change in the script, the .URL files were coming out as:



[InternetShortcut]
URL=http://intranet/MyApp
IDList=
HotKey=0
[{000214A0-0000-0000-C000-000000000046}]
Prop3=19,2
IconIndex=0
IconFile=C:\Documents and Settings\TestUser\Application Data\MyApp.ico

With the icon reference essentially in the 'wrong' section, the .url appears with the standard browser icon. This appeared to be true regardless of the default browser (i.e. Firefox, etc.); what mattered was having IE7 vs. IE6. (Computers that had created shortcuts while on IE6, then upgraded to IE7, appeared OK.)
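For comparison, a section-aware rewrite like this takes only a few lines in a language with an INI parser in its standard library; here's a sketch in Python with configparser (illustrative only, using a made-up sample file; the script itself stays in VBScript):

```python
import configparser
import io

# sample IE7-style .url contents
IE7_URL_FILE = """[InternetShortcut]
URL=http://intranet/MyApp
IDList=
HotKey=0
[{000214A0-0000-0000-C000-000000000046}]
Prop3=19,2
"""

def set_icon(url_file_text, icon_file, icon_index=0):
    """Write IconIndex/IconFile into the [InternetShortcut] section,
    leaving any extra sections (and their keys) untouched."""
    cp = configparser.ConfigParser()
    cp.optionxform = str  # keep key casing (URL, IconFile, ...)
    cp.read_string(url_file_text)
    cp["InternetShortcut"]["IconIndex"] = str(icon_index)
    cp["InternetShortcut"]["IconFile"] = icon_file
    out = io.StringIO()
    cp.write(out, space_around_delimiters=False)  # .url files use key=value
    return out.getvalue()

print(set_icon(IE7_URL_FILE, r"C:\MyApp.ico"))
```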


To resolve it, I grabbed some INI read-write functions from http://www.motobit.com, which, after including them, cleaned up the core code quite a bit anyway:



WriteINIString "InternetShortcut", "IconIndex", iIconIndex, sURLLinkFile
WriteINIString "InternetShortcut", "IconFile", IconLocation, sURLLinkFile

So we end up with:



[InternetShortcut]
URL=http://intranet/MyApp
IDList=
HotKey=0
IconIndex=0
IconFile=C:\Documents and Settings\TestUser\Application Data\MyApp.ico
[{000214A0-0000-0000-C000-000000000046}]
Prop3=19,2

Excellent stuff, right? So here's the full code:



'-------------------------------------------------------------------------------
'Creates a desktop 'shortcut', or 'icon' to a given URL
'Runs with limited-rights user, so you can give users the ability to create a
'shortcut on-demand, or roll out with a login script or similar automated device
' Author Chris Anderson cander@realworldis.com
' - Modified the script to drop the custom icon, if specified into the current users 'app data' folder
' more logo-compliant than the old version which hard-coded c:\program files\. it will now run under a
' limited rights user
' - Added prompt to launch the site after creating the shortcut
' - Modified the shortcut creation to be compatible with IE7, the previous version would create the shortcut
' successfully, but the custom icon would not be used.
'-------------------------------------------------------------------------------
Option Explicit

Const URL_SHORTCUT_NAME = "My Intranet Application"
Const TARGET_PATH = "http://intranet/MyApp"
Const ICON_REMOTE_FILE_PATH = "\\server\public\MyApp\MyApp.ico"

'the icon index can allow you to reference an icon resource in a file with many resources (i.e. an .exe or .dll)
'if using a separate .ico file, set to 0
Dim iIconIndex
iIconIndex = 0

Dim sLocalIconFilePath

'copy the icon locally so it shows up regardless of network connection
'logo compliance - copy the file to the current user's 'Application Data', as they should have write access even when not an admin
CopyIconToLocalApplicationData ICON_REMOTE_FILE_PATH, sLocalIconFilePath

'the script then creates a shortcut on their desktop, to the target url, with the local path of the icon
CreateURLShortcut URL_SHORTCUT_NAME, TARGET_PATH, sLocalIconFilePath

'prompt the user to open the shortcut now
Dim sMessage
sMessage = "There is now an icon on your desktop for " & URL_SHORTCUT_NAME & ". Would you like to open it now?"

if MsgBox(sMessage,vbYesNo) = vbYes then
OpenPath TARGET_PATH
end if

Private Sub CreateURLShortcut(ByVal ShortcutName, Byval TargetPath, ByVal IconLocation)

Dim WSHShell
Set WSHShell = createobject("wscript.shell")

Dim sDesktop
sDesktop= WSHShell.specialfolders("Desktop")

Dim oURLLink
Set oURLLink = WSHShell.createshortcut(sDesktop & "\" & ShortcutName & ".url")
oURLLink.targetpath = TargetPath
oURLLink.save

Dim sURLLinkFile
sURLLinkFile = oURLLink.FullName

'set the icon file
if Not IsEmpty(iIconIndex) AND NOT IsEmpty(IconLocation) Then

'WScript.Echo IconLocation

WriteINIString "InternetShortcut", "IconIndex", iIconIndex, sURLLinkFile
WriteINIString "InternetShortcut", "IconFile", IconLocation, sURLLinkFile

end if

set oURLLink = nothing
set WSHShell = nothing

End Sub

Private Sub CopyIconToLocalApplicationData(Byval RemotePath, ByRef LocalPath)

Dim WSHShell
Set WSHShell = WScript.CreateObject("WScript.Shell")

Dim fso
Set fso = CreateObject("Scripting.filesystemobject")

Dim objFile
Set objFile = fso.GetFile(RemotePath)

'local path will be directly under the users app data
LocalPath = WSHShell.ExpandEnvironmentStrings("%AppData%") & "\" & fso.GetFileName(objFile)

fso.CopyFile RemotePath, LocalPath

Set fso = Nothing

Set WSHShell = Nothing

End Sub

private sub OpenPath(byval TargetPath)

Dim WSHShell
Set WSHShell = createobject("shell.application")

WSHShell.Open TargetPath

set WSHShell = nothing

end sub


'----------------------------------------------------------------
'INI File Handling

'Work with INI files In VBS (ASP/WSH)
'v1.00
'2003 Antonin Foller, PSTRUH Software, http://www.motobit.com
'Function GetINIString(Section, KeyName, Default, FileName)
'Sub WriteINIString(Section, KeyName, Value, FileName)
'----------------------------------------------------------------

Sub WriteINIString(Section, KeyName, Value, FileName)
Dim INIContents, PosSection, PosEndSection

'Get contents of the INI file As a string
INIContents = GetFile(FileName)

'Find section
PosSection = InStr(1, INIContents, "[" & Section & "]", vbTextCompare)
If PosSection>0 Then
'Section exists. Find end of section
PosEndSection = InStr(PosSection, INIContents, vbCrLf & "[")
'?Is this last section?
If PosEndSection = 0 Then PosEndSection = Len(INIContents)+1

'Separate section contents
Dim OldsContents, NewsContents, Line
Dim sKeyName, Found
OldsContents = Mid(INIContents, PosSection, PosEndSection - PosSection)
OldsContents = split(OldsContents, vbCrLf)

'Temp variable To find a Key
sKeyName = LCase(KeyName & "=")

'Enumerate section lines
For Each Line In OldsContents
If LCase(Left(Line, Len(sKeyName))) = sKeyName Then
Line = KeyName & "=" & Value
Found = True
End If
NewsContents = NewsContents & Line & vbCrLf
Next

If isempty(Found) Then
'key Not found - add it at the end of section
NewsContents = NewsContents & KeyName & "=" & Value
Else
'remove last vbCrLf - the vbCrLf is at PosEndSection
NewsContents = Left(NewsContents, Len(NewsContents) - 2)
End If

'Combine pre-section, new section And post-section data.
INIContents = Left(INIContents, PosSection-1) & _
NewsContents & Mid(INIContents, PosEndSection)
else'if PosSection>0 Then
'Section Not found. Add section data at the end of file contents.
If Right(INIContents, 2) <> vbCrLf And Len(INIContents)>0 Then
INIContents = INIContents & vbCrLf
End If
INIContents = INIContents & "[" & Section & "]" & vbCrLf & _
KeyName & "=" & Value
end if'if PosSection>0 Then
WriteFile FileName, INIContents
End Sub

Function GetINIString(Section, KeyName, Default, FileName)
Dim INIContents, PosSection, PosEndSection, sContents, Value, Found

'Get contents of the INI file As a string
INIContents = GetFile(FileName)

'Find section
PosSection = InStr(1, INIContents, "[" & Section & "]", vbTextCompare)
If PosSection>0 Then
'Section exists. Find end of section
PosEndSection = InStr(PosSection, INIContents, vbCrLf & "[")
'?Is this last section?
If PosEndSection = 0 Then PosEndSection = Len(INIContents)+1

'Separate section contents
sContents = Mid(INIContents, PosSection, PosEndSection - PosSection)

If InStr(1, sContents, vbCrLf & KeyName & "=", vbTextCompare)>0 Then
Found = True
'Separate value of a key.
Value = SeparateField(sContents, vbCrLf & KeyName & "=", vbCrLf)
End If
End If
If isempty(Found) Then Value = Default
GetINIString = Value
End Function

'Separates one field between sStart and sEnd
Function SeparateField(ByVal sFrom, ByVal sStart, ByVal sEnd)
Dim PosB: PosB = InStr(1, sFrom, sStart, 1)
If PosB > 0 Then
PosB = PosB + Len(sStart)
Dim PosE: PosE = InStr(PosB, sFrom, sEnd, 1)
If PosE = 0 Then PosE = InStr(PosB, sFrom, vbCrLf, 1)
If PosE = 0 Then PosE = Len(sFrom) + 1
SeparateField = Mid(sFrom, PosB, PosE - PosB)
End If
End Function


'File functions
Function GetFile(ByVal FileName)
Dim FS: Set FS = CreateObject("Scripting.FileSystemObject")
'Default to the Windows folder if a full path is not specified.
If InStr(FileName, ":\") = 0 And Left (FileName,2)<>"\\" Then
FileName = FS.GetSpecialFolder(0) & "\" & FileName
End If
On Error Resume Next

'If the file does not exist or cannot be opened, GetFile returns Empty
GetFile = FS.OpenTextFile(FileName).ReadAll
End Function

Function WriteFile(ByVal FileName, ByVal Contents)

Dim FS: Set FS = CreateObject("Scripting.FileSystemObject")
'On Error Resume Next

'Default to the Windows folder if a full path is not specified.
If InStr(FileName, ":\") = 0 And Left (FileName,2)<>"\\" Then
FileName = FS.GetSpecialFolder(0) & "\" & FileName
End If

Dim OutStream: Set OutStream = FS.OpenTextFile(FileName, 2, True)
OutStream.Write Contents
OutStream.Close
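For comparison, the same get/set logic can be sketched in Python. This is a rough, hypothetical translation of the VBScript functions above, not a drop-in replacement; note that Python's standard library already ships `configparser`, which handles real-world INI files (comments, interpolation, etc.) natively.

```python
# Rough Python sketch of the VBScript INI get/set logic above.
# (Hypothetical translation for illustration; stdlib configparser
# is the robust choice for real INI files.)

def get_ini_string(text, section, key, default=""):
    """Return the value of key in [section], or default if absent."""
    in_section = False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            in_section = (line[1:-1].lower() == section.lower())
        elif in_section and "=" in line:
            name, _, value = line.partition("=")
            if name.strip().lower() == key.lower():
                return value.strip()
    return default

def write_ini_string(text, section, key, value):
    """Return new INI text with key set in [section], adding
    the key (or the whole section) if it is missing."""
    out, in_section, written, seen_section = [], False, False, False
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            # Leaving the target section without writing: insert key first.
            if in_section and not written:
                out.append(f"{key}={value}")
                written = True
            in_section = (stripped[1:-1].lower() == section.lower())
            seen_section = seen_section or in_section
        elif in_section and stripped.partition("=")[0].strip().lower() == key.lower():
            line = f"{key}={value}"   # replace the existing key line
            written = True
        out.append(line)
    if not written:
        if not seen_section:
            out.append(f"[{section}]")
        out.append(f"{key}={value}")
    return "\n".join(out)
```

As with the VBScript version, a missing key is appended at the end of its section, and a missing section is appended at the end of the file.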

Tuesday, May 20, 2008

PowerShell - Converting a script

As a further exercise with PowerShell, I've converted the 'resetLastLogin' script (http://www.microsoft.com/technet/scriptcenter/csc/scripts/desktop/settings/cscds075.mspx) from VBScript to PowerShell. The script resets the 'default user name' on the specified computer. Basically, if you as an admin logged in to someone's computer, it may show your name as the last login, confusing the regular user of the computer. This script blanks it out, or sets it to the given text. The script can be run against a remote computer as well, to reset the last login left by a remote desktop session.

Analysis



The first batch of lines in the VBScript handles the arguments, with appropriate defaults:

Dim sComputer
Dim sNewUserName

if WScript.arguments.count > 0 then
sComputer = WScript.arguments(0)
if WScript.arguments.count > 1 then
sNewUserName = WScript.arguments(1)
else
'default the user name to blank
sNewUserName = ""
end if
else
'default computer name to local
sComputer = "."
sNewUserName = ""
end if


This was converted using the parameter definition

param ([string]$MachineName, [string]$DefaultUserName)

and applying some defaults

if ($MachineName -eq "") {$MachineName = $env:COMPUTERNAME}
if ($DefaultUserName -eq "") {$DefaultUserName = ''}

While the overall code is much cleaner, I had to make non-intuitive use of the "-eq" operator for comparison, vs. just an '=' sign. I could have left the "[string]" definition off of the parameters and the script worked fine, but when I later went to echo those values back, they rendered as blank.

VBScript registry access was similar in that we first get a registry object:

'get the registry object
Set oReg=GetObject("winmgmts:{impersonationLevel=impersonate}!\\" & _
sComputer & "\root\default:StdRegProv")

get the old value

oReg.GetStringValue HKEY_LOCAL_MACHINE, strKeyPath, strValueName, strValue

and set the new value

oReg.SetStringValue HKEY_LOCAL_MACHINE, strKeyPath, strValueName, sNewUserName


In PowerShell, the object-oriented pattern means we first get an object representing the registry hive:

$reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $MachineName)

then get an object representing the subkey:

$regKey = $reg.OpenSubKey($keyPath,$true)

and finally get the old value (only to display it later) and set the new value:

$oldValue = $regKey.GetValue($valueName)
$regKey.SetValue($valueName,$DefaultUserName)


The last step simply echoes something back so the user can see what happened. In VBScript

Wscript.echo "Changed DefaultUserName from '" & strValue & "' to '" & sNewUserName & "'"

and in PowerShell

Write-Host "Changed DefaultUserName from $oldValue to $DefaultUserName on $MachineName"


Full Source Code



The VBScript (without introductory comments) is 43 lines:

Dim sComputer
Dim sNewUserName

if WScript.arguments.count > 0 then
sComputer = WScript.arguments(0)
if WScript.arguments.count > 1 then
sNewUserName = WScript.arguments(1)
else
'default the user name to blank
sNewUserName = ""
end if
else
'default computer name to local
sComputer = "."
sNewUserName = ""
end if

Dim sNewValue
sNewValue = ""

'constants for registry access
const HKEY_CURRENT_USER = &H80000001
const HKEY_LOCAL_MACHINE = &H80000002

'standard output
Set StdOut = WScript.StdOut

'get the registry object
Set oReg=GetObject("winmgmts:{impersonationLevel=impersonate}!\\" & _
sComputer & "\root\default:StdRegProv")

'set the key to grab
strKeyPath = "SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
strValueName = "DefaultUserName"

'get the value
oReg.GetStringValue HKEY_LOCAL_MACHINE, strKeyPath, strValueName, strValue

'set the new value
oReg.SetStringValue HKEY_LOCAL_MACHINE, strKeyPath, strValueName, sNewUserName

'display it
Wscript.echo "Changed DefaultUserName from '" & strValue & "' to '" & sNewUserName & "'"


The PowerShell version is 14 lines:

param ([string]$MachineName, [string]$DefaultUserName)
if ($MachineName -eq "") {$MachineName = $env:COMPUTERNAME}
if ($DefaultUserName -eq "") {$DefaultUserName = ''}

$keyPath = "SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
$valueName = "DefaultUserName"

$reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $MachineName)
$regKey = $reg.OpenSubKey($keyPath,$true)

$oldValue = $regKey.GetValue($valueName)
$regKey.SetValue($valueName,$DefaultUserName)

Write-Host "Changed DefaultUserName from $oldValue to $DefaultUserName on $MachineName"
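The round-trip at the heart of the script - open the key, read the old value, write the new one, report the change - is a general pattern. A minimal Python sketch follows, with a plain dict standing in for the registry key (this stand-in is an assumption for illustration; on Windows, the stdlib `winreg` module provides the real `OpenKey`/`QueryValueEx`/`SetValueEx` calls):

```python
# Sketch of the get-old-value / set-new-value / report pattern from the
# PowerShell script above. A dict stands in for the registry key so the
# sketch is platform-neutral; winreg supplies the real calls on Windows.

def swap_value(key, value_name, new_value):
    """Set value_name to new_value and return the previous value (or None)."""
    old_value = key.get(value_name)   # ~ $regKey.GetValue($valueName)
    key[value_name] = new_value       # ~ $regKey.SetValue($valueName, ...)
    return old_value

# Hypothetical current contents of the Winlogon key.
winlogon = {"DefaultUserName": "admin-jdoe"}
old = swap_value(winlogon, "DefaultUserName", "")
print(f"Changed DefaultUserName from '{old}' to '{winlogon['DefaultUserName']}'")
```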

Wednesday, April 2, 2008

Windows PowerShell

Started spending some time with Windows PowerShell, a next-generation command shell and scripting environment for Windows. It can be downloaded from Microsoft's site. It's finally an update to the classic DOS commands, providing simple commands for interacting with WMI, the file system, the registry, etc. The syntax is closer to .NET, and return values are actually rich .NET data types.

After PowerShell is installed, you get a new 'console' (type 'powershell' in the Run box to access it). The syntax of PowerShell commands is verb-noun and has a fairly intuitive layout; I think this contrasts nicely with the DOS commands and their vestiges of the 8.3 file names. In PowerShell, the following two commands will likely be the first you run:
    • get-command (lists all available commands)
    • get-help {command name} (lists further details on the given command)

DOS vs. VB Script vs. PowerShell
If DOS commands are the 1.0 version of an admin's 'console', PowerShell could be considered 2.0, and perhaps VBScript (or, yes, any WSH-compatible scripting language) would be 1.5. While DOS relied on imperative commands, the admin using it didn't really have to 'think' like a programmer. To do any work with the file system or registry without a very obscure DOS batch file, VBScript had to be used; the same went for any work with WMI. However, to use VBScript, the admin really did have to think like a programmer: looping, conditionals, object instantiation and disposal, etc. are, for many tasks, critical. I've introduced a number of non-programmers to VBScript (WMI, etc.), and the barrier is two-fold: they have to learn the steps to accomplish the task at hand (i.e. retrieving or setting something through WMI), as well as a modest amount of programming concepts they may never have dealt with. Sometimes that is a reason to stick with a DOS batch file for a step of the task, such as copying a file using 'xcopy', vs. using VBScript, where the process may be more elegant and allow for more structure such as error handling (albeit in a rudimentary form).

PowerShell moves back to the DOS batch file usage scenario, where an admin (non-programmer) can focus on the task at hand with a modest understanding of programming concepts such as looping, conditionals and variables, and nearly no need to understand object lifetime concepts. In PowerShell, most commands return and accept objects as simply as they would accept string values.

The first advantage of PowerShell that struck me was the ease of interacting with WMI, even on remote computers. Note that PowerShell is not a 'replacement' for WMI, any more than VBScript/WSH was. WMI is still the underlying infrastructure exposing information about a computer; the syntax and library around it is what's changing here. The implicit functions (called cmdlets - "command-lets") provide much of it, and the adoption of a .NET-style syntax, where basically more can be done on a single line of code, helps as well.

For example, here is a script I had sitting around to determine how much memory was on a remote computer.
In VBScript:

strComputer = "ComputerB"
Set wbemServices = GetObject("winmgmts:\\" & strComputer)
Set wbemObjectSet = wbemServices.InstancesOf("Win32_LogicalMemoryConfiguration")

For Each wbemObject In wbemObjectSet
WScript.Echo "Total Physical Memory (kb): " & wbemObject.TotalPhysicalMemory
Next


In PowerShell:
Get-WmiObject Win32_LogicalMemoryConfiguration –computername ComputerB

Albeit, I'm not a fan of terseness for terseness' sake (or else we would just write everything in Perl :)), but I do appreciate the ability to have a single command do the most obvious intent, and to send that output into other commands.

Platforms
PowerShell is supported on Windows XP or newer. However, at this point, PowerShell does not ship natively with any Microsoft OS, although it appears it will be an optional install on Windows Server 2008. Since it's not native on my client computers, I anticipate it primarily being used in admin 'push'-style scripts vs. something end-user facing.

Projection
At a very early glance, I see PowerShell replacing the utility of DOS batch files and VBScript/WSH solutions as soon as the platforms ubiquitously support it. Right now, the barriers are computers running Windows 2000 or earlier (with no support for PowerShell), and the fact that even Windows XP computers already have support for DOS batch files and have WSH, while they don't have PowerShell.

For the short to mid-term, there will still be a number of DOS batch files and VBScript solutions for end-user facing needs: functions that can run without admin privileges, run by the user on demand, or during their log on or a similar event. This will be simply because they most likely already have everything they need to run it.

However, for admin-driven scripts, where you as an IT person will be pushing something from your computer 'out' to other computers, or querying other computers, install PowerShell and try it out, starting with the replacements for some of the basic DOS commands you might know, and evolving into WMI and replacing your VBScript, if you have any. I think you will find it a productive environment for accomplishing many common admin tasks.

In a future article, I will detail a line-by-line conversion from one of my more complex VBScripts into PowerShell, and also evaluate some of the PowerShell editors that provide some advanced edit/view features.

Further Reading
Free Windows PowerShell workbook: server administration
Windows PowerShell Getting Started Guide

Sunday, February 24, 2008

Filling in form letters from Access

I ran across a few Access apps earlier last year where the customer had some very simple requests - to update the header, addresses, logos, etc. on the form letters their apps produced. Upon investigating the applications, I discovered that the original developer had essentially designed Access reports that simply mimicked the template the customer used: textboxes, labels, images, etc. A seemingly reasonable approach, but in this case, with the end users having no Access experience, the text was essentially 'locked' from their point of view... and even if they had such experience, whenever the template changed, the Access developer would have to reproduce their changes. A colleague, not a programmer, asked a simple question and made a simple suggestion - why couldn't they just use their existing Word template, with the app filling it in from the current record?

In digging, found this article, http://msdn2.microsoft.com/en-us/library/aa140082(office.10).aspx, to accomplish just that. Modified it slightly to sit in a code module and take a RecordID, allowing the requested record to be used, and to toggle the template based on other fields in the record (in this case, the templates could be assumed to contain the same form properties; only the boilerplate text differed). We shipped the document templates in a folder called 'Templates' sitting next to the application.
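The underlying idea - code fills in named slots while the users own the surrounding boilerplate - can be sketched with Python's `string.Template`. (The field names and letter text here are hypothetical; the Access solution itself fills form properties in a real Word template via automation, per the linked article.)

```python
from string import Template

# Hypothetical user-editable boilerplate; in the Access solution, this
# lives in a Word template the customer can edit themselves.
letter = Template(
    "Dear $FirstName $LastName,\n"
    "Your request $RecordID has been received.\n"
)

# Hypothetical field values pulled from the current record.
record = {"FirstName": "Wenyang", "LastName": "Hu", "RecordID": "1042"}
print(letter.substitute(record))
```

The key property is the same as with the Word templates: the template text is data, editable without touching the application.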

Advantages
  • The customer can update their own templates using Microsoft Word
  • Further ability to lock down modify access to the application, without risking cutting off manageability in the future (i.e. even if all existing staff leave or forget about the app, the template is sitting out for anyone with appropriate permissions to edit)
Disadvantages
  • Having the separate 'Templates' folder is an extra dependency that could be deleted, forgotten during a move, etc.
  • An uncareful Word editor could accidentally delete or modify form properties instead of just the boilerplate text
  • Dependency on Word on the end-users side
Alternatives I have used in the past for similar scenarios:
-Scanning in the form and using as a background (more typical with 3rd party forms than the form letters I was dealing with here, can get out of hand in file size)
-The whole embedding the form letter as a Access report
-In the software BriefCase, used a model of exporting data to a text file, then kicking off Word documents with the Mail Merge feature bound to the text file. The user could modify their own templates, or even create new ones using a master list of merge fields (this is probably the richest model I've used yet, just certainly less quick-and-dirty than the others; a little bit beyond basic Word training is needed)
-In SQL Server Reporting Services, laying out a report to emulate a form letter (while having the fewest dependencies, assuming an SSRS environment exists, it is very awkward to design, as the tool is much more geared to tabular or matrix-based reports and flow layout, vs. the fixed paginated layout needed). FWIW, I haven't really run across a web-based solution that produces good fixed-layout, paginated reports in any app I've built...
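The BriefCase-style approach above - exporting data to a delimited text file that Word's Mail Merge binds to - can be sketched in a few lines. (The field names and records here are hypothetical; Word's merge just needs a header row of field names followed by one data row per letter.)

```python
import csv
import io

# Hypothetical records, e.g. pulled from an Access/SQL query.
records = [
    {"FirstName": "Wenyang", "LastName": "Hu", "Phone": "425-123-4567"},
    {"FirstName": "Jane", "LastName": "Doe", "Phone": "425-765-4321"},
]

def write_merge_source(rows, stream):
    """Write a tab-delimited merge source: a header row of field names,
    then one row per record. Word's Mail Merge can bind to this file."""
    fields = list(rows[0])
    writer = csv.DictWriter(stream, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)

# In practice: open("merge_source.txt", "w", newline="") instead of StringIO.
buf = io.StringIO()
write_merge_source(records, buf)
print(buf.getvalue())
```

Once the file is written, the Word template's merge fields (a master list the users can draw from) do the rest, keeping the boilerplate entirely in the users' hands.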

I've used the Word form-properties fill method in 3-4 Access applications, where I've needed to tack on a quick form letter reporting solution and was able to ship and control a whole 'folder', vs. having to ship a single file and/or prepare a more full-fledged installer package. I would recommend it in closed environments where you can assume that Word is standard (which is probably a good bet if Microsoft Access is available).

Future steps are to tie the model into .NET apps, which is hopefully where some of the Access apps will be going for other reasons. In Access, I could see pursuing paths such as storing the templates in the db as BLOBs, or building something that would automatically download the templates if missing or updated... Not sure what bang-for-buck there is there. In a .NET app in the same environment, ClickOnce would manage the templates as dependencies.

Tuesday, February 19, 2008

Remote Desktop Gadget 2.1

Grabbed the updated version of the Remote Desktop Gadget for the Vista Sidebar. The 1.x version was really the only alternative I found that worked for me. Other tools out there have features for looking up an OU, etc.: great concept, but they always seem to lock up. This tool seems very simple and clean. The 2.x version added 'History', in the form of a drop-down box when you expand the gadget, and is still very lightweight with this new feature.

Friday, February 8, 2008

Installing the AD Admin pack in Vista

http://blogs.msdn.com/nickmac/archive/2007/03/06/installing-adminpak-on-windows-vista.aspx
This link shows how to install the Active Directory (AD) Admin pack in Windows Vista

Snippet for launching a ClickOnce application

While perhaps a less common deployment scenario, this code will launch an application installed through ClickOnce (notably, assuming it is already installed).

'Launch the ClickOnce shortcut via its .appref-ms file.
'Process.Start is a Shared method; it returns the new Process.
Dim runningProc As System.Diagnostics.Process
runningProc = System.Diagnostics.Process.Start( _
    "c:\Documents and Settings\" & UserName & "\Start Menu\AppName.appref-ms")

'Do not receive an event when the process exits
runningProc.EnableRaisingEvents = False


What is an "Application Reference"?

Sunday, January 20, 2008

Aside

Complete aside... this song got stuck in my head after viewing... He actually seemed pretty good to me :)