2008 Scripting Games Statistics

Now that the 2008 Scripting Games are over, I was wondering how the various scripting languages broke down in terms of individual participation.  I contacted fellow MVP Marco Shaw about this a few weeks ago because last year he wrote a script that would generate a nice chart using PowerGadgets showing the breakdown of the 2007 Scripting Games participation by division for each country.  He had been working on running his old script against this year’s results, and was kind enough to let me have his work in progress to experiment with myself (thanks Marco!).

After tweaking the script off and on (more off than on) over the past few weeks I’ve managed to get the results I was looking for.  The following screenshot shows two charts from the results of each of the last two years of the Scripting Games, all generated using PowerGadgets.  The charts on the left show the breakdown of individual participation by country for the top 10 countries (where the top 10 countries are defined by those with the most unique participants across all divisions), sorted alphabetically.  The charts on the right show the number of unique participants in each division.  The 2007 results are on the top, and the 2008 results are on the bottom.

[Screenshot: Scripting Games statistics dashboard showing the four charts described above]

The results are pretty interesting.  Not surprisingly, the charts show that PowerShell is growing in popularity.  Last year there were one-third as many participants in the PowerShell categories as in the VBScript categories.  This year that gap has narrowed, with PowerShell participation climbing to just under half of the VBScript participation.  The charts also show that there were only two changes in the top 10 participating countries since last year, and that VBScript wasn’t the scripting language of choice in all top 10 countries in either year.

In addition to the charts that are output, my updated version of Marco’s script also outputs some general statistical information for the years that it is being run against.  From this I can see that the number of individual participants has increased from 510 in 2007 to 709 in 2008, with the number of active participants (where an active participant is defined as one that participated in 5 or more events) increasing from 378 in 2007 to 563 in 2008.

The script used to generate these results can be found here.

All in all, the Scripting Games seem to be increasing in popularity year over year which is likely a trend that will continue as PowerShell and other scripting languages continue to gain traction.  It will be interesting to see how things pan out next year!

Kirk out.


P.S. One of the many things I was involved in while I wasn’t blogging during the month of February was the 2008 Scripting Games.  A while back, Scripting Guy Greg Stemp invited me to be a guest commentator for this year’s games (thanks Greg!), and I was assigned Advanced Windows PowerShell Event #5.  While I unfortunately didn’t have time to participate in the other events this year, I did find some spare time during a train trip to Toronto, so I wrote my solution for the event on the train.  The games are all done for this year, but if you’re interested in my solution, it can be found here.

P.P.S. I’m trying out using Windows Live SkyDrive as the site from which to share ps1 files.  If you have any problem viewing the script file I’ve linked to in this article, please let me know.

Learn about PowerShell at Ottawa Code Camp 2008!

2008 marks the first year that Ottawa will be hosting a Code Camp event.  A Code Camp is a free, one-day event by developers, for developers.  It’s a great place to spend a Saturday learning about developer-related material from your peers.  This year’s event takes place on April 5, 2008 at Algonquin College on the corner of Baseline and Woodroffe.

I’ll be presenting a PowerShell session at the Ottawa Code Camp 2008 event, titled “What is PowerShell and what opportunities does it provide to a developer?”.  It will run about 60-70 minutes, which isn’t much time considering what I want to present, so it will likely be a fun presentation as I try to pack a lot of information into a little bit of time.  I’m hoping to whet a PowerShell appetite you didn’t know you had while I show you what PowerShell is and how you can use it for rapid prototyping of .NET code, test-driven development, and support purposes.

Of course there are many other sessions worth attending too.  Every session at the Code Camp is a great place to start learning new technologies and to ask questions.

You can find out more about the event, the speakers, the sessions, and how to register on the official Ottawa Code Camp site.

I hope to see you there!

Kirk out.


PowerShell Deep Dive: Using $MyInvocation and Invoke-Expression to support dot-sourcing and direct invocation in shared PowerShell scripts

When creating PowerShell script (ps1) files to share with the community, there are a few different ways you can configure their intended use.  You can configure a ps1 file so that it contains one or more functions and/or variables that must be loaded into PowerShell before they can be used; this loading is done via a technique called dot-sourcing.  Alternatively, you can make the body of the ps1 file the script itself that you want to share, without encapsulating it in a function.  With this configuration, your script consumers invoke the script using the absolute or relative path to your ps1 file, prefixing it with the call operator (&) and wrapping it in quotation marks if the path contains a space.  Let’s look at each of these in more detail, along with the advantages of each approach.

Dot-sourcing a ps1 file is like running the PowerShell script it contains inline in the current scope.  You can pass in parameters when you dot-source a ps1 file, or you can dot-source it by itself.  To dot-source a ps1 file you must use the full absolute or relative path to that file.  Aside from the handling of any parameters, the PowerShell script inside the ps1 file is run as if you had typed it manually into the current scope.  An advantage to this approach is that the variables and functions within the ps1 file that use the default scope will be declared in the current scope, and will therefore be available afterwards without requiring users to know the location of the script file.  This allows users to dot-source a ps1 file in their profile and have the functions and/or variables it contains available in every PowerShell session they open.  If you had a ps1 file with the path ‘C:\My Scripts\MyScript.ps1’, you would dot-source it like this:

. 'C:\My Scripts\MyScript.ps1'

Before I get to invoking scripts directly, I need to make an important note about dot-sourcing script files.  Users need to be careful when dot-sourcing, because it is possible to dot-source a script that was intended to be invoked and have it appear to function the same as if you had invoked it, with parameters passed and the script within appearing to run as expected.  This is not a good practice.  Only dot-source ps1 files containing functions and variables you want available in your current session.  If the ps1 file you are using was intended to be invoked and not dot-sourced, steer clear of the dot-source operator.  Otherwise you risk leaving crumbs (variables and functions) of the dot-sourced script files behind in your current session, some of which may have been intended to be deleted when they went out of scope (secure strings used to temporarily store passwords, for example).  Since the current scope is the root scope, these won’t go out of scope until you close PowerShell.  I have often seen users in the online community dot-source ps1 files while passing parameters when they should be using the call operator instead; it’s not a good idea.  Now back to invoking scripts directly…
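
To make the crumbs concrete, suppose a script named Invoke-Me.ps1 (a hypothetical name) stores a password in a variable called $securePassword while it runs.  In a fresh session, invoking the script leaves nothing behind, while dot-sourcing it does:

PS C:\> & .\Invoke-Me.ps1
PS C:\> Get-Variable securePassword -ErrorAction SilentlyContinue
PS C:\> . .\Invoke-Me.ps1
PS C:\> Get-Variable securePassword

The first Get-Variable call returns nothing because the child scope created by the call operator is already gone; the second still finds the variable, which will stay in the session until you close PowerShell.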

Invoking a script directly is akin to calling a function.  You can pass in parameters when you invoke a ps1 file, or you can invoke the ps1 file by itself.  To invoke a ps1 file you must use the full absolute or relative path to that file.  If that path contains one or more spaces, it must be wrapped in quotation marks and the call operator (&) must be used; otherwise the quoted path will just be treated as a string and output to the console (note: it is a good practice to always use the call operator when invoking a script this way so that it works whether or not the path contains spaces).  When you invoke a ps1 file, a child scope is created and the contents of that ps1 file are executed within that child scope.  An advantage to this approach is that the script file doesn’t leave anything behind after it is run unless it explicitly declares a function or variable as global.  This keeps the PowerShell environment clean.  If you had a ps1 file with the path ‘C:\My Scripts\MyScript.ps1’, you would call it like this:

& 'C:\My Scripts\MyScript.ps1'
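
To see the difference the call operator makes, compare the following (the -name parameter is just a hypothetical example):

PS C:\> 'C:\My Scripts\MyScript.ps1'
C:\My Scripts\MyScript.ps1
PS C:\> & 'C:\My Scripts\MyScript.ps1'
PS C:\> & 'C:\My Scripts\MyScript.ps1' -name SomeValue

Without the call operator the quoted path is just a string expression, so PowerShell simply echoes it back; with the call operator the script actually runs, and parameters are passed along as with any other command.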

Between these two approaches, there is no best practice indicating which is the right one to use.  It seems to simply be a matter of preference.  Unfortunately, for the most part it is the script author’s preference, not the script consumer’s.  To get ps1 files they find online working the way they want, script consumers may have to modify the file so that it dot-sources correctly, or so that it runs correctly when invoked using the call operator, or they may just copy and paste the script into their own ps1 file or profile to get it running the way they like.  The end result is that each time a ps1 file is updated by its author, the script consumer may have manual steps to take to bring that update into their own environment.

What if ps1 files could be created so that they supported both of these configuration approaches?  What if they always worked as expected, whether they were dot-sourced or invoked directly?  And what if you want the functionality the ps1 file provides to work inside a pipeline, whether you dot-source it and call the function or invoke the file directly inside your pipeline?  Fortunately, PowerShell is a rich enough scripting language to let you do just that.

The first thing you need to do to make this work is determine how the script file was used.  PowerShell includes a built-in variable called $MyInvocation that allows your script to look at the way it was invoked.  Among other things, $MyInvocation includes two properties you’ll need to understand to make this work: InvocationName and MyCommand.  InvocationName contains the name of the command that was used to invoke the script.  If you dot-sourced the script, this will contain '.'.  If you invoked the script using the call operator, this will contain '&'.  If you invoked the script using the path to the script itself, this will contain the exact path you entered, whether it was relative or absolute, UNC or local.  MyCommand contains information that describes the script file itself: the path under which it was found, the name of the script file, and the type of the command (always ExternalScript for ps1 files).  These two pieces of information can be used together to determine how the script was used.  For example, consider a script file called Test-Invocation.ps1 at the root of C: on a computer named PoShRocks that contains the following script:

if ($MyInvocation.InvocationName -eq '&') {
    'Called using operator'
}
elseif ($MyInvocation.InvocationName -eq '.') {
    'Dot sourced'
}
elseif ((Resolve-Path -Path $MyInvocation.InvocationName).ProviderPath -eq `
    $MyInvocation.MyCommand.Path) {
    "Called using path $($MyInvocation.InvocationName)"
}

Regardless of whether you dot-source Test-Invocation.ps1 or invoke it directly, and regardless of whether you use a relative local path, an absolute local path, or an absolute remote (UNC) path, this script will output how it was used.  Here are a few examples of how you might use this script, with the associated output:

PS C:\> . .\Test-Invocation.ps1
Dot sourced
PS C:\> . C:\Test-Invocation.ps1
Dot sourced
PS C:\> . \\PoShRocks\c$\Test-Invocation.ps1
Dot sourced
PS C:\> & .\Test-Invocation.ps1
Called using operator
PS C:\> & C:\Test-Invocation.ps1
Called using operator
PS C:\> & \\PoShRocks\c$\Test-Invocation.ps1
Called using operator
PS C:\> .\Test-Invocation.ps1
Called using path .\Test-Invocation.ps1
PS C:\> C:\Test-Invocation.ps1
Called using path C:\Test-Invocation.ps1
PS C:\> \\PoShRocks\c$\Test-Invocation.ps1
Called using path \\PoShRocks\c$\Test-Invocation.ps1

As you can see, each time the script knows exactly how it was used, so we can use that knowledge to make it behave appropriately in any situation.

Now that we’re armed with that knowledge, let’s add a function to our script that will do something simple, like output the definition of another PowerShell function.  First, we’ll need to write our function:

function Get-Function {
    param(
        [string]$name = $(throw 'Name is required')
    )
    if (-not $name) { throw 'Name cannot be empty' }
    if ($name -match '[^a-z0-9-]') {
        Write-Error 'Unsupported character found.'
    }
    elseif ($function = Get-Item -LiteralPath function:$name) {
        "function $name {
`t$($function.Definition)
}"
    }
}

This function is pretty straightforward.  You call it passing in the name of a function and it outputs the function definition, including the name, to the console.
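
For example, once the function above is loaded into your session, asking for the definition of the built-in prompt function might look like this (the exact output will vary with your profile):

PS C:\> Get-Function -name prompt
function prompt {
	'PS ' + $(Get-Location) + $(if ($nestedpromptlevel -ge 1) { '>>' }) + '> '
}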

The next step is to follow that function definition with a slightly modified version of our Test-Invocation.ps1 script.  Basically we just want to know whether the file was invoked or dot-sourced.  If it was invoked, we want to automatically call our Get-Function function and pass the parameters used during the invocation directly through to the Get-Function function call.  If it was dot-sourced, we don’t want to do any additional work, because the function will be imported into the current session so that we can use it without the script file, as intended.  This has the added benefit of preventing script that wasn’t intended to be executed from running when the file is dot-sourced.  Here’s the start of the additional script that we’ll need to put after our Get-Function definition:

if ($MyInvocation.InvocationName -ne '.') {
    Get-Function # How do we pass arguments here?
}

This additional piece of script uses a simple if statement to compare $MyInvocation.InvocationName against the dot-source operator.  If they are equal, this portion of the script does nothing, allowing the function to be dot-sourced into the current session without invoking it.  If they are not equal, we know that the script was invoked either directly or using the call operator, so we need to call Get-Function so that the invocation uses the internal function automatically.  But as noted in the comment in the snippet above, how do we pass the arguments that were used during the invocation into the internal function?  There are two approaches that I can think of to resolve this.  We could use the param statement at the top of the script to declare the same parameters that Get-Function accepts.  The problem with this approach is that it duplicates code unnecessarily, and I really don’t like duplicating code.  The other approach is to use Invoke-Expression inside our if statement to pass the arguments received by the script directly into the internal function.  The only special trick required here is to pass arguments that start with ‘-’ through as-is, so that parameter names still bind by name just as they would if you dot-sourced the script first and then invoked the function, while wrapping every other argument in quotation marks so that values survive being re-parsed by Invoke-Expression.  I think that’s a much better approach, so here’s our updated if statement:

if ($MyInvocation.InvocationName -ne '.') {
    Invoke-Expression "Get-Function $($passThruArgs = $args; for ($i = 0; $i -lt $passThruArgs.Count; $i++) {if ($passThruArgs[$i] -match '^-') {$passThruArgs[$i]} else {"`"$($passThruArgs[$i])`""}})"
}

In this implementation, if the script file was invoked, Invoke-Expression is used to invoke the Get-Function function, passing arguments received by the script directly through to Get-Function.  And as just mentioned, I use the -match operator to determine whether a given argument starts with ‘-’, in which case it is passed through untouched so that I end up calling Get-Function with named parameters.  This is a trick that applies nicely to quite a few situations in the PowerShell scripting I do.
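
If you want to see the command string that actually gets built, you can run the subexpression on its own with some sample arguments:

PS C:\> $passThruArgs = '-name', 'prompt'
PS C:\> "Get-Function $(for ($i = 0; $i -lt $passThruArgs.Count; $i++) {if ($passThruArgs[$i] -match '^-') {$passThruArgs[$i]} else {"`"$($passThruArgs[$i])`""}})"
Get-Function -name "prompt"

The parameter name passes through untouched while the value is wrapped in escaped quotation marks, so the string handed to Invoke-Expression is exactly what you would have typed yourself.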

At this point, we have a complete script file that can be invoked to execute the internal function directly or dot-sourced to import the internal function into PowerShell, all with a little help from $MyInvocation and Invoke-Expression.  This script can be seen below.

Get-Function.ps1 listing #1:

function Get-Function {
    param(
        [string]$name = $(throw 'Name is required')
    )
    if (-not $name) { throw 'Name cannot be empty' }
    if ($name -match '[^a-z0-9-]') {
        Write-Error "Unsupported character found in $name."
    }
    elseif ($function = Get-Item -LiteralPath function:$name) {
        "function $name {
`t$($function.Definition)
}"
    }
}
if ($MyInvocation.InvocationName -ne '.') {
    Invoke-Expression "Get-Function $($passThruArgs = $args; for ($i = 0; $i -lt $passThruArgs.Count; $i++) {if ($passThruArgs[$i] -match '^-') {$passThruArgs[$i]} else {"`"$($passThruArgs[$i])`""}})"
}
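
Assuming listing #1 is saved as 'C:\My Scripts\Get-Function.ps1' (the path is just an example), both usage styles now behave the way you would expect:

PS C:\> & 'C:\My Scripts\Get-Function.ps1' -name prompt
PS C:\> . 'C:\My Scripts\Get-Function.ps1'
PS C:\> Get-Function -name prompt

The first command invokes the file and runs Get-Function immediately; the second imports the function without running it, after which the third can call it directly.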

Now, I’m sure you’re thinking that’s great, flexible, etc., but where’s the pipeline support that was mentioned earlier?  Well, this is also possible in PowerShell, although it adds another layer of complexity to the script.  The nice part is that it will work whether the ps1 file is invoked in a pipeline or whether the function it contains was previously imported by dot-sourcing the file and is then used in a pipeline.  The trick is to use the Begin, Process and End blocks and the $_ variable, both at the root level of the ps1 file and in the internal Get-Function function.

At the root scope of the script file, the Begin block is used to declare any functions and variables used in the script.  The Process block actually calls the function that is being exposed through the script (in a pipeline if appropriate), and the End block is used for cleanup (although we don’t have any cleanup to do).  Similarly, inside the Get-Function function, the Begin block is used to check parameters that don’t support pipeline input, the Process block is used to check the state of some parameters and actually do the work (using the objects coming down the pipeline if appropriate), and the End block is again used for cleanup (although again, we have none).  The end result of adding these to our script, plus a few modifications so that users can invoke the script file or the function with -? to get the syntax, can be found in Get-Function.ps1 listing #2.

Get-Function.ps1 listing #2:

BEGIN {
  function Get-Function {
    param(
      [string]$name = $null
    )
    BEGIN {
      if (($name -contains '-?') -or ($args -contains '-?')) {
        'SYNTAX' | Write-Host
        'Get-Function [-name] <string>' | Write-Host
        break
      }
    }
    PROCESS {
      if ($name -and $_) {
        throw 'Ambiguous parameter set'
      }
      elseif ($name) {
        $name | Get-Function
      }
      elseif ($_) {
        if ($_ -match '[^a-z0-9-]') {
          throw 'Unsupported character found.'
        }
        elseif ($function = Get-Item -LiteralPath function:$_) {
          "function $_ {
`t$($function.Definition)
}"
        }
      }
      else {
        throw 'Name cannot be null or empty'
      }
    }
    END {
    }
  }
}
PROCESS {
  if ($MyInvocation.InvocationName -ne '.') {
    if ($_) {
      Invoke-Expression "`$_ | Get-Function $($passThruArgs = $args; for ($i = 0; $i -lt $passThruArgs.Count; $i++) {if ($passThruArgs[$i] -match '^-') {$passThruArgs[$i]} else {"`"$($passThruArgs[$i])`""}})"
    }
    else {
      Invoke-Expression "Get-Function $($passThruArgs = $args; for ($i = 0; $i -lt $passThruArgs.Count; $i++) {if ($passThruArgs[$i] -match '^-') {$passThruArgs[$i]} else {"`"$($passThruArgs[$i])`""}})"
    }
  }
}
END {
}
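
With listing #2 saved as 'C:\My Scripts\Get-Function.ps1' (again, the path is just an example), the same file now works in the pipeline in both configurations:

PS C:\> 'prompt', 'Clear-Host' | & 'C:\My Scripts\Get-Function.ps1'
PS C:\> . 'C:\My Scripts\Get-Function.ps1'
PS C:\> 'prompt', 'Clear-Host' | Get-Function
PS C:\> Get-Function -?

The first command pipes function names into the invoked ps1 file, the next two import the function and pipe into it directly, and the last one displays the syntax.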

And there you have it.  Now you know how to create versatile ps1 files to share with the community that:

  1. Automatically discourage unrecommended usage (no internal code executes and no parameters are processed when the file is dot-sourced rather than invoked).
  2. Support importing functions and variables via dot-sourcing.
  3. Support direct invocation via the path and the call operator (if necessary).
  4. Output syntax when called with -?.
  5. Work in the pipeline as both a ps1 file and an imported function.

This all may seem very complicated at first, but once you learn how it works it’s really not that complicated at all.  And hopefully the consumers of your script will thank you for all of your hard work in making it possible.

Thanks for reading!

Kirk out.


Using PowerShell in a Vista MUI Environment

I came across an issue recently that took a little while for me to resolve, so I thought I would share it in case someone else encounters the same issue.

If you install Windows PowerShell 1.0 on English Vista and then later install a multi-lingual user interface (MUI) language pack for Vista, PowerShell will continue to display help information in English when you use the Get-Help cmdlet, regardless of what language your account is configured to use.  This appears to happen because installing the Vista MUI language pack doesn’t automatically install the appropriate language files for PowerShell (although my understanding of Vista’s approach to supporting a MUI environment indicates it should).

In order to resolve this and get the appropriate language files installed on your computer, you must do the following:

  1. Uninstall PowerShell using an account that is configured to use the same language PowerShell was installed with (in my case, English).
  2. Reboot your computer (I don’t know if this is absolutely necessary but I did it just in case).
  3. Download and install PowerShell 1.0 again.  This time the extra language files will be installed automatically.

In my case I had applied the French MUI language pack for Vista to an English installation of 64-bit Vista Enterprise when I first encountered this issue.  I also tried numerous times to uninstall PowerShell, but at the time I was using an account configured to use French, and this seemed to result in an incomplete uninstallation of PowerShell (although I didn’t get any warnings or errors on uninstall).  I only knew there was a problem when I tried to install PowerShell again and was informed that PowerShell was not compatible with my version of Windows (not a very helpful error in this case).  Regardless, after rolling back to a restore point from before the PowerShell uninstall and then following the steps above, I was able to resolve this issue, and now I can use PowerShell in French in my Vista MUI environment.  Hopefully this will help someone avoid banging their head against the wall while trying to figure out this problem in the future.

Kirk out.


Quality, not quantity, for the most part

When I started this blog last year I set a personal goal to post something at least once a week.  I wasn’t that busy at the time and it seemed like a reasonable thing to do.  Well, it’s now been about 6 weeks since I last published anything on this blog, but not for lack of wanting.  Life simply became extraordinarily busy for all of February and the first part of March, and too many higher priorities consumed every minute of free time I could muster for me to justify spending any of it writing for my blog.

It’s not that writing a blog post is that complicated.  It’s just that I didn’t want to post just anything.  I tend to prefer posts that are a little less frequent but that hopefully offer a little more value to the reader than just reposting what’s already out there simply because I don’t have time to do anything else.  Quality, not quantity.  That reflects how I look at many things in life.  Perfectionism at its best.  It’s a gift…and a curse. 🙂

Well, I think my preference for quality over quantity got the better of me, and I’m sure I’ll be crazy busy like I was in February again in the future, so it’s time to rethink my approach to blogging.  I have lots of ideas on how to approach this, but I’ll need to experiment a little to see what works best.  Essentially I’m going to try to find a better balance between the meatier posts that I like to write to share the results of my PowerShell research with you, and lighter, shorter posts about what’s going on in the PowerShell space and the cool things I’m working on in PowerGUI, to maintain better continuity going forward.  Hopefully you won’t see a break in posts like this happen again in the future.

If you stuck around, waiting for an update from me, thanks.  I’m going to do my best to make you happy that you did.

Kirk out.



.NET Rocks! episode #311 and RunAs Radio episode #42 are now available

Just to follow up on Monday’s post, the .NET Rocks! and RunAs Radio recordings are now available.

.NET Rocks! episode #311 can be found here.

RunAs Radio episode #42 can be found here.

And in case you are just reading this now and didn’t see my other post, dnrTV episode #99 was posted on Sunday and can be found here.

If you have some time and want to learn more about PowerShell and PowerGUI, I encourage you to give these shows a listen.

Thanks again to Carl and Richard for inviting me to be on these shows and for setting up Kirk week on Pwop! 🙂  This has been a real treat.

As with anything I do, feedback is welcome and appreciated, positive or negative.

Kirk out.

P.S. If you like screencast or podcast recordings like this and would like to see more, let me know.  And don’t forget the PowerScripting Podcast, a bi-weekly podcast about all things PowerShell.


Learn about PowerGUI on .NET Rocks!, dnrTV and RunAs Radio

During the past two weeks I’ve had the pleasure of sitting down and chatting with Carl Franklin, Richard Campbell and Greg Hughes of .NET Rocks!, RunAs Radio and dnrTV fame about PowerShell, PowerGUI and the Quest AD Cmdlets.  For those of you who don’t know, .NET Rocks! is an internet audio talk show about .NET and dnrTV is the equivalent in an internet screencast format.  Both of these are targeted at the .NET developer.  RunAs Radio is an internet audio talk show for IT professionals who work with Microsoft products.

I didn’t know what to expect from any of these interviews, but fortunately Carl, Richard and Greg are real pros at this, and it was a treat to chat with them.  I believe the end result is much better than what you get from the typical webcast or screencast, because they ask the right questions and great conversation comes out as a result.

I was planning on blogging about this before any of these were posted, but these guys are really on top of things, and to my surprise this morning I found out that Carl had already posted the recording from dnrTV, so you can watch episode #99 now.  It is simply called “Kirk Munro on PowerGUI”, and can be found here.  The .NET Rocks! and RunAs Radio sessions are not posted yet, but they should be later this week.  I’ll post an update once they are available.

Whether you’re interested in PowerGUI, PowerShell, .NET or working with Microsoft products in general, I encourage you to watch dnrTV and listen to RunAs Radio and .NET Rocks!  What a great way to spend your morning and evening commute in the subway, on the bus, or in your car (although please don’t watch dnrTV while driving your car…that’s just not a good idea; no matter how much you want to learn what PowerGUI and PowerShell can do for you, they won’t fix your car).

Thanks again to Carl and Richard for getting in touch with me and setting this up!  It’s been a great experience!

Lastly, if you’re interested in listening to a bi-weekly podcast that is specifically about PowerShell, you should also check out the PowerScripting Podcast.  The PowerScripting Podcast is hosted by Jonathan Walz and Hal Rottenberg and if you’re into all things PowerShell like I am, this gives you one more way to stay informed about PowerShell during the daily commute.

Thanks for listening!

Kirk out.


PowerGUI 1.0.13 is now available

Yesterday PowerGUI 1.0.13 was made available for download on the PowerGUI community site.  Aside from the many great new features in PowerGUI and the PowerGUI Script Editor (which you can read about here on Richard Siddaway’s blog), I wanted to share some of the details about the enhancements and additions that were made in the area that I am responsible for in this release — the PowerPacks.  Here is a list of the changes that were included:

Local System PowerPack:

  • added description, startup type and logon account to the output properties on the services node
  • replaced hard-coded event log tree with dynamic tree that shows all event logs on the system (note: this doesn’t support the custom views that can be created in the Vista event log viewer yet)
  • added actions to clear all events in an event log, set the maximum size and set the overflow policy
  • updated the Drives node so that drives are automatically grouped by provider type when there are multiple drives on a system
  • fixed issues preventing the browsing of certain drives from working properly
  • added support for viewing the security descriptor and the access control list for files, directories and registry keys
  • added take ownership support for security descriptors
  • added Values link to view the values associated with a registry key and Change Value action to change a registry value
  • added support for an expanded view of environment variables that contain multiple values delimited by semi-colons
  • added open file support
  • added support for signing files from a certificate provider drive

Active Directory PowerPack:

  • replaced Browse the Domain node with Browse Active Directory node; this supports browsing all of Active Directory within PowerGUI, not just the Domain Naming Context node
  • added action to delete a computer object from AD
  • added Member Of (Recursive) links for groups, and computers
  • added Member Of link for users

WMI Browser PowerPack (new!):

  • introduced brand new PowerPack for browsing WMI objects on the local computer or remote computers
  • exposed support for managing specific computers via WMI; you just use the Add Connection and Remove Connection actions that are exposed through the root WMI Browser node
  • exposed all WMI objects on a computer; you just browse through the WMI object tree to the one you are looking for and then use the Get WMI Objects link to view the WMI objects of that class type (see the sketch below for the kind of query this surfaces)
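
For those curious, this kind of browsing boils down to WMI queries you can also run yourself in PowerShell; presumably something along these lines (server01 and Win32_LogicalDisk are just example names):

# List the WMI classes available on a computer...
Get-WmiObject -List -ComputerName server01 | Sort-Object Name
# ...then retrieve the instances of one of those classes.
Get-WmiObject -Class Win32_LogicalDisk -ComputerName server01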

There is still a lot more to do with the PowerPacks, and this is only the beginning.  I’m completely focused on enhancing the PowerPacks that come with PowerGUI, so if you have suggestions, requests, or feedback to offer, I’d be more than happy to hear it — just leave me a comment on this blog.  Or if you are trying to make a PowerPack yourself and want some help or suggestions, you can comment about that here too, or just post about it on the PowerGUI Forums.

And if you haven’t downloaded PowerGUI 1.0.13 yet, please give it a try and let me know what you think!

Kirk out.


Supporting -Whatif, -Confirm and -Verbose in *localized* scripts

Way back in February last year, Jeffrey Snover posted an article on the PowerShell blog describing how to support -Whatif, -Confirm and -Verbose in PowerShell functions and scripts.  As he states in that article, this is very important because it shows you how you can give consumers of your function or script the same safeguards that they get in PowerShell cmdlets.  If you haven’t read the article and you are sharing scripts that make system changes with others, you should take a few minutes to read it and strongly consider (read: this is really a good idea) using the Should-Process function it describes in those scripts. You can find that article here.

That said, there is one issue that I do have with the implementation of the Should-Process function as posted in that article — it is full of hard-coded English strings when it doesn’t need to be.  At the time of this writing PowerShell is already localized in 10 languages and is already being used heavily internationally, so it makes sense to provide localized PowerShell scripts whenever possible, right?  That might sound daunting, but since the strings used in the output generated by -Whatif, -Confirm and -Verbose are already stored in the PowerShell resource string tables, we should be able to extract them using PowerShell itself and then use the extracted, localized strings in our scripts.

So how do we go about doing this?  The answer is pretty simple: use Get-PSResourceString.

Get-PSResourceString is part of the CmdletExtensionLibrary.ps1 script that I published a little while ago.  It is used to retrieve localized resource strings from the PowerShell engine and host.  Using that function I was able to convert all of the strings that are used in Jeffrey’s version of the Should-Process function into calls to Get-PSResourceString so that they will be localized.  The end result is a localized1 version of the Should-Process function that I have also included in the most recent version of the CmdletExtensionLibrary.ps1 script.
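
If you are curious where those strings live, you can poke at the engine’s resource tables yourself.  The following is only an exploratory sketch of the idea, not how Get-PSResourceString is implemented:

# Find the resource tables embedded in the PowerShell engine assembly;
# the strings behind -Whatif, -Confirm and -Verbose output are stored in tables like these.
$engine = [System.Management.Automation.PSObject].Assembly
$engine.GetManifestResourceNames()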

Both of these functions are quite long and not very blog-friendly, so I’ll leave it up to you to open the CmdletExtensionLibrary.ps1 script file in the editor of your choice and take a look at them.  This script originally started as a small project to dig into dynamic parameters, but I’ve found the functions I created in that work so useful myself that I’ve been adding more and more to it, a trend that will likely continue.  If you only want the Get-PSResourceString and Should-Process functions, you can simply copy them out of that script and use them however you like.  And of course, if you see any issues with the functions in this script, have any questions, or want to share suggestions on how to improve it even more, I’m more than happy to listen.

Kirk out.

1 Note that while I’m trying to provide localized scripts as well as functions to facilitate writing localized scripts to support the international PowerShell community, all documentation for these functions is only in English and there are a few hard-coded strings only in English for variable and alias descriptions and one output message in English for the script file itself.  Aside from those few exceptions (that you’re not likely to see anyway), if you’re using a localized version of PowerShell the functions in the CmdletExtensionLibrary.ps1 file should properly support your localized version.


Windows Live Mail – my new newsreader of choice

I spend a fair amount of time answering questions in the Microsoft PowerShell discussion group (*cough* newsgroup).  The easiest way I have found to do this over the past year is to use good ol’ Outlook Express (um…dated!).  It has the simplicity and efficiency that let me stay on top of what’s going on with little effort.

Well, recently Garth Jones suggested I use Windows Mail instead, but that comes with Vista, and I don’t yet use Vista on the majority of machines I work on.  Fortunately Windows Live Mail is available; it is the successor to both Windows Mail on Vista and Outlook Express on Windows XP, so I decided to give it a try.  I’m quite happy I did.  It maintains the simplicity I was used to in Outlook Express while adding many nice new features, such as support for voting on posts, identifying the type of message you post (comment, question, or suggestion), and some light integration with blogging that lets you publish a newsgroup message directly to your Windows Live Spaces blog (hopefully in the future this will extend to Windows Live Writer so that you can publish to blogs on other hosting sites as well).  And it’s free.  All in all, it’s quite a nice improvement over Outlook Express, and I recommend giving it a try, whether you need to post a message on a newsgroup or just want to read existing posts.

There is also one other nice little feature that Garth brought to my attention.  Posts from any Microsoft MVP are identified with an MVP logo if they were posted to the newsgroup using Windows (Live) Mail and viewed using Windows (Live) Mail as well.  Here’s part of a screenshot showing how posts are identified as questions vs. comments, as well as the nice little MVP logo that appears next to Microsoft MVPs.

[Screenshot: Windows Live Mail newsreader showing question and comment message types and the MVP logo]

It’s not a must have feature, but I certainly like it. 🙂

Kirk out.
