Friday, November 21, 2008

Heading back to Japan

I'm sitting at Vancouver airport in the Air Canada business class lounge waiting for my flight back to Tokyo. I really lucked out on weather this week: dry and mostly sunny the whole time. It's coming down in a solid, steady downpour at the moment, but I really don't have to care about that since I'm inside.

Noon must be a popular departure time for Air Canada because the lounge is packed. I know this isn't all for my plane since business class isn't that big. Since I wasn't sure about the traffic and the border, I got here quite early. It gave me a chance to catch up on work-related stuff.

It was a fun break and I had a chance to meet everyone. Not sure when my next trip is but hopefully not too long from now.

Sunday, November 16, 2008

PowerShell profiles

PowerShell has one oddity that is both useful and somewhat weird. When you first launch PowerShell, it looks for "profile scripts" and, if they are found, runs them automatically. That's useful, but the location and filename of this setup script are hard-coded and fixed. The scripts located at [All Users]\[All Users Documents]\WindowsPowerShell\Microsoft.PowerShell_Profile.PS1 and [Current User]\[My Documents]\WindowsPowerShell\Microsoft.PowerShell_Profile.PS1 are run. This allows you to pre-load and preconfigure a variety of stuff, which is nice. Why they are in a hard-coded place under My Documents with such a long name, I don't know.
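If you don't want to build those paths by hand, PowerShell exposes the current user's profile path through the built-in $profile variable. A quick sketch for creating the file might look like this (the -Force flag also creates the WindowsPowerShell folder if it's missing):

```powershell
# $profile points at the current user's profile script path
$profile
# Create the profile (and its folder) if it doesn't exist yet
if (-not (Test-Path $profile)) {
    New-Item -Path $profile -ItemType File -Force
}
notepad $profile   # open it for editing
```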

As a side note, PowerShell will only run digitally signed scripts by default, so you might need to change that setting to run scripts of your own. You should probably just make an internal CA for signing scripts, though.
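If you don't want to set up signing right away, a common compromise is the RemoteSigned policy, which lets local scripts run unsigned while still requiring signatures on downloaded ones:

```powershell
Get-ExecutionPolicy                  # show the current policy
Set-ExecutionPolicy RemoteSigned     # local scripts run unsigned; downloaded ones must be signed
```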

Another interesting thing is the transcript function. The transcript records all of the contents of the shell window to a text file. Whatever you type and whatever is reported to the screen gets recorded. I think that is very useful, so I've come up with a profile script that starts a transcript automatically. Of course, that adds up to a lot of text files, so I have the script clean those up for me, too.

Here is my script, colored and highlighted by PowerGUI (a free PowerShell tool). I may have to experiment with the formatting to get it to work correctly on the blog screen.

[string]$TimeStamp = Get-Date -UFormat "%Y-%m-%d at %H%M%S"
$MaxAge = New-TimeSpan -Days 7
$LaunchTime = Get-Date
$MyDocsPath = Get-ItemProperty "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
$TranscriptFolder = $MyDocsPath.Personal + "\WindowsPowerShell"
[string]$TranscriptPath = $MyDocsPath.Personal + "\WindowsPowerShell\" + $TimeStamp + " Transcript.txt"
[string]$LaunchPath = $MyDocsPath.Personal + "\Script Experiments"
Start-Transcript $TranscriptPath
Get-ChildItem -Path $TranscriptFolder -Filter *.txt | Where { ($LaunchTime - $_.CreationTime) -gt $MaxAge } | Remove-Item
cd $LaunchPath


The $TimeStamp variable is a date-time formatted "YYYY-MM-DD at HHMMSS" so that the transcript that gets created has a unique, useful name.

The $MaxAge variable is how long I want to keep the transcripts. Date comparisons in PowerShell work completely differently than in VBScript: you have to create a TimeSpan object to compare against. It took a lot of experimenting to get that working the way I wanted it to.
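To illustrate the idea: subtracting one date from another in PowerShell produces a TimeSpan object, which you can then compare against another TimeSpan (the dates below are just made-up examples):

```powershell
$limit   = New-TimeSpan -Days 7
$elapsed = (Get-Date) - (Get-Date "2008-11-01")   # a TimeSpan, not a number
$elapsed -gt $limit        # True once more than a week has passed
$elapsed.Days              # the difference expressed in whole days
```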

I read the location of My Documents from the registry. I picked the registry so that I can be sure to grab the path of a relocated My Documents folder. I haven't tested this with a network-relocated My Documents folder yet. One thing about reading the registry from PowerShell: you grab the key (the folder), then you retrieve each value as a property. So, you'll notice that I grab "Shell Folders" as $MyDocsPath and then get the value for the entry "Personal" by requesting $MyDocsPath.Personal. That is definitely different than VBScript.

The $LaunchPath variable is just where I happen to keep my scripts; you would need to change or delete it to match your preferences.

The script then starts the transcript for the current session, purges any .txt files in the transcript folder that are older than the max age variable, and changes the current directory to whatever I put into the $LaunchPath variable.

Let me know what you think...

Back in Seattle...

...well, Anacortes at least.

Flew in from Las Vegas to Vancouver BC and then rented a car to drive down to Washington. Amazingly, there was only one car in front of me in line at the border. For all intents and purposes, I only waited one minute (maybe two) for a border crossing on a Friday. That's never happened before.

Saturday, November 15, 2008

Airport wireless

I'm sitting in the Las Vegas airport, which offers free wireless throughout the terminal. The Vancouver BC airport offers free wireless, too, and it got me thinking: why do so many airports only offer it as a paid service? With DSL costs and equipment costs dropping all of the time, I have a hard time understanding why an airport can't provide it. Take one of the satellite terminals at Sea-Tac as an example: I think you could cover it with two Cisco 1200 series access points with high-gain antennas. Add a NetScreen or WatchGuard firewall and a DSL line, and you could support 200+ users. Upfront costs would be about $3,000 (probably less), the monthly cost for the connection would run less than $100, and warranty support would cost less than $300 per year.

Isn't that a small investment for traveler convenience?

Friday, November 14, 2008

ILM from Microsoft

I attended a session on Identity Lifecycle Management (ILM) for SharePoint, and we walked through the process of configuring ILM. It is a pretty complicated system. Unfortunately, it is a mission-critical service for my current company that controls everything about Active Directory. I wonder if I should be trying to learn that system or if I should stick with Exchange and SharePoint...

Thursday, November 13, 2008

Cirque du Soleil show

Last night, I went to the Cirque du Soleil show at the MGM Grand called KA. It was a really good show, but I was surprised at how short it was. I have been to several traveling shows, and they were all two-act, two-hour-plus shows, whereas KA was only 90 minutes long. It was still a $90 ticket, though.

Still, the show was worth the ticket price.

Presentation on how Microsoft deployed Exchange 2007

Harold Wong presented a seminar on how Microsoft deployed Exchange 2007 internally and it was an interesting presentation. They have 150,000 users worldwide so the scale is quite large but they took the time to do some price/performance/benefit experiments that produced some surprising conclusions. The ones that seemed unusual to me are:

  • Exchange 2007 mailbox servers are typically 2 CPU / dual core servers, 24 gigs of RAM, and large Direct Attached SCSI arrays with 2.5 inch SFF, 10,000 RPM, 146 gigabyte disks.
  • They are not Windows clustered servers. Each server is part of an Exchange 2007 CCR cluster, but the server itself is not a “classic” cluster.
  • There is no SAN and no shared storage.
  • With 10 terabytes of raw disk space, they have one server supporting between 4,000 and 6,500 users with 1 gig and 2 gig mailbox limits.
  • Site to site replication via an SCR cluster is only partially implemented.
  • They have chosen not to split CCRs across a WAN because of the way CAS servers and hub servers load balance. Both parts of the CCR need to sit on the same subnet and AD site, and their associated hub servers need to do the same. Since the CAS servers load balance automatically, roughly half of your clients would always be crossing the WAN to get from the CAS server to the mailbox server.
  • Tests with 5,400 RPM SATA arrays showed that Exchange could easily run on very slow hardware. They kept the 10,000 RPM SCSI anyway because it could support a higher number of users at their preferred minimum response time for a lower cost per user than the SATA. However, for environments with fewer than a thousand mailboxes, SATA would be perfectly acceptable for most organizations. These tests were run several years ago, so newer SATA drives are probably even better values now.

I also attended a session on deploying large mailboxes in an economical way. This presentation referenced a lot of statistics produced by Microsoft and Dell about costs and impacts. Based on that data, the cost per user for 2 gigabyte mailboxes was only 25% higher than the cost per mailbox at 250 megabytes. The Microsoft design team is currently testing with 10 gigabyte mailbox limits to see what the impacts are to operations. They brought up some interesting points about large mailboxes that I hadn’t thought of:

  • If you give users a large mailbox, there is no archive; everything is live. If everything is “live”, then everything is reachable from every access medium (OWA, Outlook, OMA, etc.)
  • Server side data is backed up, local data is not
  • Server side data is discoverable in a lawsuit, local data is not
  • Server side data is access protected, local data is not.

I think I will propose a 10 gigabyte structure for my current company just to see what the cost impacts really are.

First impression of conference

The DevConnections conference seems to be really well organized. It is also quite a bit larger than I thought it would be. I took one of the preconference sessions on PowerShell scripting. It was actually two full days of classes with a lot of hands-on labs. I use VBScript for a lot of administrative tasks, but all of my experience is self-taught – I really didn’t want to redo all of that pain for PowerShell. The instructor was Don Jones, a Microsoft MVP and author of several books, and he was a really good teacher.

The class used a Windows 2008 Active Directory domain controller in a virtual machine for the PowerShell lessons. One oddity with the current version of PowerShell is that there are no cmdlets from Microsoft for manipulating Active Directory. However, Quest Software has developed a set that they distribute for free, and they are pretty good. They were also smart enough to use names that are unlikely to conflict with the versions that Microsoft is bound to release eventually.
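As a rough sketch of how the Quest set is used (this assumes the free snap-in is installed and registered; the account and computer names are made up):

```powershell
# Load the free Quest ActiveRoles snap-in
Add-PSSnapin Quest.ActiveRoles.ADManagement
# The Quest nouns are all prefixed with "QAD", so they won't collide
# with whatever Microsoft ships later
Get-QADUser -SamAccountName jdoe
Get-QADComputer -Name "SERVER01"
```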

One of the best things that I learned about PowerShell is that you can call any existing command-line command, program, or other executable from inside PowerShell. You can use PowerShell to grab a whole bunch of information, shove that into PowerShell variables, and then pass those variables as arguments to other programs. That should make it a lot more flexible than I originally thought.
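A small sketch of that pattern, using made-up paths and an ordinary Win32 executable:

```powershell
# Gather information with PowerShell...
$bigLogs = Get-ChildItem C:\Logs -Filter *.log | Where { $_.Length -gt 10MB }
# ...then pass it straight to a regular command-line program
foreach ($log in $bigLogs) {
    compact.exe /c $log.FullName   # compact.exe is a plain Win32 tool
}
```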

PowerShell is almost too flexible, however. Since you can do almost anything, it can be hard to get it to do exactly what you actually want it to do.

Wednesday, November 05, 2008

Heading to Las Vegas

I will be heading out to Las Vegas for the www.Devconnections.com seminars on Friday, November 7th, at about 5pm (Tokyo time). The conference lasts a full week since I added a two-day course in PowerShell scripting. Since Microsoft intends for it to replace VBScript for day-to-day administration work, I figured I had better learn more about it. I learned VBScript through simple experimentation - that was painful, so I wanted to get some class time for the replacement. If I learn anything good, I'll try to post it here or on my www.SBWorks.com site.

I will be in Seattle from November 14th through the 20th, and I hope to have lunch or dinner with as many people as possible. Let me know if you want to meet.