Channel: The Official System Center Service Manager Blog

Tricky Way to Handle Review Activity Approvals with the Exchange Connector


One of our TAP customers, Alexsander Popov, came up with a clever way to handle review activity approvals in email.  The Exchange Connector requires that an email be sent with the work item ID in square brackets in the subject and some configurable keyword that represents approve or reject in the body.  I’ve recommended using [Approved] and [Rejected] in the body.  Normally you would have to provide instructions to the reviewer on what to enter when they reply to the email.  Alexsander removes some of the chance for user error by using a mailto: handler in the email body.  When he sends emails they look like this:

[screenshot]

Each of the links is actually a mailto: link so that when the user clicks on the link it will open a new mail and populate the subject and the body with the required text like this:

[screenshot]

He did this by creating a notification template which has an <a> tag around the Approved and Rejected links.  The <a> tag has an href attribute that is set to:

mailto:someemailaddress@somecompany.com?subject=[<insert the work item ID here>]&body=[Approved]
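Putting it together, the relevant fragment of the notification template body might look something like this sketch (the address and work item ID are made-up placeholders; in a real template the ID would come from a substitution string, and &amp;amp; is the escaped form of &amp; inside an HTML attribute):

```html
<!-- Sketch only: the address and work item ID below are placeholders -->
<a href="mailto:servicedesk@contoso.com?subject=[IR1234]&amp;body=[Approved]">Approved</a>
<a href="mailto:servicedesk@contoso.com?subject=[IR1234]&amp;body=[Rejected]">Rejected</a>
```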

We’ve tested this approach in Outlook, OWA, and Windows Phone 7 and it works great!

Thanks for sharing that creative solution Alexsander!


Top Secret Trick–Installing the Portal on an Upgraded SP1 Management Server


If you do the following:

1) Install the SM Management Server RTM

2) Upgrade the Management Server to SP1

3) Try to install the SP1 portals

the setup will block you.  This is kind of annoying!

 

Here is how you work around it… (do this at your own risk):

1) Back up the encryption key – there is a utility on the CD image in the Tools folder.  Instructions

2) Uninstall the management server.

3) Restore the encryption key using the same utility. Instructions

4) Install the management server again and point to the existing database

5) Install the portal – It works! :)

Partner Demo: Cased Dimensions Service Level Management


Cased Dimensions, a Microsoft partner that offers an SLA management solution on top of Service Manager, released their management pack a while back.  Since then it has seen some additional improvements and new features.  They have put together a demo video so everyone can see what it does at a glance.

Cased Dimensions’ solution extends Service Manager to add more comprehensive support for service level management.  They have support for all types of work items, not just incidents.  You can specify business hours and holidays.  You can specify different SLAs by creating criteria by which to apply the SLAs to incidents, such as those incidents which are ‘Priority less than 3’.  A more recent feature they have added is the ability to weight specific users so that they have a modified SLA target.  For example – if the target resolution time is normally 20 minutes, you could weight an executive at 50% of target so that the target resolution time would be 10 minutes.

This is really a fantastic solution and is a great addition on top of the relatively basic SLA management capabilities SCSM offers out of the box.  Please enjoy this new demo video!

Reporting: Count of Incidents by Classification Query and How to Get Enumeration Display Strings


There really isn’t a lot of information out there about reporting so I’m just going to start putting some things out here in the blog as I come across them.  Hopefully, these little posts will eventually add up to something meaningful collectively.

This blog post will cover a couple of things –

1) How to join two dimensions – the incident dimension and the incident classification dimension.

2) How to get the display strings for a particular language for a particular enumeration.

Here is the query:

SELECT Strings.DisplayName AS Classification, COUNT(*) AS Incidents
FROM IncidentDimvw Incident
Join  IncidentClassificationvw Classification ON Incident.Classification_IncidentClassificationId = Classification.IncidentClassificationId
Join  DisplayStringDimvw Strings ON Classification.EnumTypeId = Strings.BaseManagedEntityId
WHERE Strings.LanguageCode = 'ENU'
GROUP BY Strings.DisplayName

Here is a sample of the results:

[screenshot]

A couple of interesting things that are good to know:

All enumerations (aka “Lists”) that are brought over to the data warehouse are stored on dimension tables.  We also call them “Outriggers”.  The display strings for those enumerations are stored on the DisplayString dimension.  You’ll need to join across the enumeration dimension table to the DisplayString dimension table to get the localized display string for a given enumeration value.

Also notice how the query above queries the views (notice the vw at the end of the names) not the tables directly.  This is definitely a recommended best practice.  Although it is unlikely that we will change the table schema drastically, having customers use the views instead of the tables provides us the flexibility to change the table schema if we need to in the future without breaking customers’ reports.

More to come on report authoring…

How to Notify the Assigned to User when an Incident is Updated via the Exchange Connector


I’ve had this question come up a few times now, so to avoid repeating myself I’ll just blog about it! :)

Scenario: Affected user receives an email from the service desk requesting additional information.  The affected user replies to the email providing the additional information.  The Exchange connector picks up the email and updates the action log with the additional information.

Question:  How does the assigned to user get notified that the affected user has provided the required information?

Here’s how…

  1. Decide on some property or value that indicates the user has updated the incident.  For now let’s assume that we are going to add a new status called ‘Updated by Affected User’ as a sub status under Active.  Do this by editing the incident status list in the Lists view.  You could do other things like create your own custom property called ‘Updated by User’ or whatever.
  2. Create an incident template that sets that property from #1.  In this example the template would set the incident status to ‘Updated by Affected User’.  Hint: you can change the status in template mode using the ‘Change Incident Status’ task in the Tasks pane.
  3. In the Exchange connector properties dialog there is a place where you can choose which template to apply when an incident is being updated.  Select the template you created in #2.  Now, whenever the incident is updated by the Exchange connector, this template will also be applied, which will change the status to ‘Updated by Affected User’.
  4. Create a notification template for this scenario if you haven’t already done so.
  5. Create an incident event workflow with the criteria: when incident status changes FROM not equal to ‘Updated by Affected User’ TO equal to ‘Updated by Affected User’.  Choose the notification template you created in #4 and the assigned to user.

Governance, Risk, and Compliance (GRC) Management Pack SP1 is RTM!


The Microsoft Solution Accelerator and the System Center teams are pleased to announce the availability of the Microsoft IT GRC Process Management Pack SP1 for System Center Service Manager.

This SP1 update takes advantage of the improved functionality of Microsoft System Center Service Manager 2010 SP1, which provides an integrated platform for automating and adapting an organization’s IT service management best practices.

SP1 updates to the IT GRC Process Management Pack will enable organizations to:

  • Provide timely, insightful views into the health and status of IT governance, risk, and compliance (GRC) programs through dashboard view reports
  • Dynamically identify your IT GRC program assets through business services or user-defined queries
  • Create a customized IT GRC solution that efficiently translates complex regulations and standards into authoritative control objectives and activities that apply to your organization

Download the SP1 release of the IT GRC Process Management Pack from the Microsoft Download Center.  First time installation and upgrades are supported during software setup.


For a complete end-to-end solution, the IT GRC Process Management Pack should be used in combination with System Center Service Manager 2010 SP1 and the IT Compliance Management Series. To learn more, please visit the System Center site.

  • System Center Service Manager provides a customizable and extensible platform that enables customers and partners to implement additional GRC capabilities.
  • The IT GRC Process Management Pack for System Center Service Manager provides end-to-end compliance management and automation for desktop and datacenter computers.
  • The IT Compliance Management Series helps IT pros configure Microsoft products and continuously monitor their compliance to address specific IT GRC requirements.

More Information

Compliance Management Forum

Compliance Solution Accelerators

Security Solution Accelerators

Microsoft Solution Accelerators

Service Manager Portal Source Code Released!


The feedback on the self-service portal has consistently been that it needs to be more customizable.  That is by far the #1 feedback item on SCSM 2010.  The types of desired customizations typically fall into these categories:

  • change the text shown in a label
  • hide buttons or web parts
  • change the style (colors, fonts, etc.)
  • change the layout
  • show more or less information
  • add additional capabilities
  • add languages or allow users to choose which language the portal is displayed in

Most of these things would ideally be configurable options on the web parts, with some sort of administrative UI that allowed you to change them.  Making things like that configurable via an admin experience requires significant development and test time which unfortunately we didn’t have in the 2010 product development cycle.  That is what we are shooting for in the vNext version (currently code named “R2”) of the SCSM portal, which we are super excited to show you at the Microsoft Management Summit for the first time.

The vNext portal will be based on SharePoint 2010 (any version, including Foundation) and will be made up of web parts.  You can change the text shown on labels, change fonts, change the style, change the layout, etc. using standard SharePoint administrative experiences and tools.  It will be relatively easy to add new content to the portal by creating web parts that use the SDK and plugging them into the SharePoint site.  You will also be able to add languages or change display strings by modifying the string resource files, and we will let users choose which language the portal is displayed in.  The portal will be a single SharePoint site instead of an analyst portal and an end user portal, but the user experience will be role based so people only see what they need to see.  In short, it will be what we all wished the SCSM 2010 portal could have been.

In the meantime, we know that the lack of customizability of the portal is a pain point for customers right now.  Many of the modifications that customers want to make are relatively trivial if you have the source code and a little bit of know-how.  To help customers achieve their goals for a customized self-service portal in the near term, we are releasing the source code to the portal today.  Not everybody has Visual Studio and a developer on hand to make these customizations, but a lot of our customers do.  Some of our SCSM administrators, consultants, and SI partners are even developers at heart and are excited about the possibility of making the self-service portal do what they want it to do.  We realize it isn’t ideal to have to modify code to get what you want, but it is better than not being able to modify it at all!  Hopefully this will suffice while we are busy building the new portal.

But we also have another fun surprise!  We are also including a newly styled portal in this download that you can deploy with copy/paste of some files.  The new site looks like this:

[screenshot]

It is styled to look similar to the SharePoint 2010 default site template:

[screenshot]

Looks a lot better huh?!

This updated version of the portal is also intended to show you some of the possibilities for customizing the portal.  The updated version of the portal has the following modifications:

  • style changes – modified CSS stylesheets
  • when you click in the knowledge search textbox the text actually clears instead of staying there

[screenshots: new vs. old]

  • you can update incidents from the portal!

[screenshot]

  • The “where am I at?” highlighting has been removed so that it doesn’t show step 1 highlighted when you are on step 2.  Do we really think people are going to get lost in a three step wizard?

[screenshot]

  • I’ve added a new Request New User option and associated web part to show how you can extend the self-service portal to gather inputs that are directly mapped to an automated activity that can drive a workflow.

[screenshots]

Results in a new change request being created with an automated activity that looks like this:

[screenshots]

When the automated activity kicks in it will create a new user in Active Directory using PowerShell:

[screenshots]

I’ve commented most of the code where I have made modifications with “PORTALMOD” so you can search through the code base and easily see all of the modifications that I made.

So – to recap, this new release includes:

  • The source code of the portals that ship with the product
  • An updated version of the portal with the bug fixes and features described above
  • The modified source code for the updated version of the portal
  • A document describing in detail how to modify, build, and deploy the portal

I’ve also attached to this blog post the MP that is needed to make the new user provisioning scenario go.  Please don’t use this in production though!  I have spent minimal time testing it as it is just an example of what you could do.  To deploy it you just need to import the MP .xml file and put the workflow .dll file in the C:\Program Files\Microsoft System Center\Service Manager 2010 directory.

I also want to discuss the roadmap for the portal.  The SCSM 2010 SP1 portal is an ASP.NET web site that hosts web parts.  Those web parts can be put on a SharePoint site.  The results are not great, but you can do it, and it can be made to look good now that you have the source code.  So, in that sense, what we have done for SCSM 2010 and what we are doing for the SCSM vNext portal are similar, the only difference being that in vNext we will require SharePoint as the hosting framework for the web parts.  In theory you could port the web parts you develop today to run on the vNext portal, since it is all just SharePoint and web parts.  The one detail there is that the web parts you create today using the source code provided will use a different way of accessing the data than the vNext portal will.  We are still sorting out the details of that and whether or not you could mix and match 2010 web parts with 2012 web parts on the same SharePoint server, but I think we’ll be able to make it work.

Lastly, I just want to make sure that everyone is clear on what this release is.  Think of it as a shortcut to developing your own custom portal using the Service Manager SDK.  We have effectively shown you how to do it and provided some sample code.  Whatever you create is treated as if you created it from scratch: if there is a bug in a modified portal, you won’t be able to call Microsoft support about it.  I hope that makes sense and seems fair.  If there is a bug in the SDK somewhere, then of course you can call Microsoft support about it.  And of course we are here to help you on the forums and blog if you need some help getting started, have questions, or want to discuss a bug.

I’m really looking forward to seeing what people do with this.  We have had some internal groups here at Microsoft take this source code and do this already and it has worked out great.

Download the files from the Microsoft Download site here:

http://www.microsoft.com/downloads/en/details.aspx?FamilyID=65fbe0a3-1928-469f-b941-146d27aa6bac&displaylang=en

You can also download this from MSDN or TechNet if you are a subscriber - the release there is called the 'Service Manager Portal SDK'.

Big THANKS to Paresh (the Microsoft developer on the original portal who helped pull this together), the marketing team for funding the portal style update, and all the community beta testers that validated this release for us!

Enjoy!

Service Manager 2010 SP1 Now Supports Windows Server 2008 R2/Windows 7 SP1 and SQL Server 2008 SP2


Service Manager 2010 SP1 now officially supports Windows Server 2008 R2/Windows 7 SP1 and SQL Server 2008 SP2. 

I guess that’s about all there is to say about that.  :)


Information on the Service Manager Sessions at MMS 2011


Are you going to MMS 2011?  If so, then you won’t want to miss all the great System Center Service Manager related sessions we have planned.  There will also be a lot of Hands On Labs and Instructor Led Labs to check out.  You can get all the details straight from Sean Christensen over on the Nexus SC blog.

J.C. Hornbeck | System Center Knowledge Engineer

The App-V Team blog: http://blogs.technet.com/appv/
The WSUS Support Team blog: http://blogs.technet.com/sus/
The SCMDM Support Team blog: http://blogs.technet.com/mdm/
The ConfigMgr Support Team blog: http://blogs.technet.com/configurationmgr/
The SCOM 2007 Support Team blog: http://blogs.technet.com/operationsmgr/
The SCVMM Team blog: http://blogs.technet.com/scvmm/
The MED-V Team blog: http://blogs.technet.com/medv/
The DPM Team blog: http://blogs.technet.com/dpm/
The OOB Support Team blog: http://blogs.technet.com/oob/
The Opalis Team blog: http://blogs.technet.com/opalis
The Service Manager Team blog: http://blogs.technet.com/b/servicemanager
The AVIcode Team blog: http://blogs.technet.com/b/avicode
The System Center Essentials Team blog: http://blogs.technet.com/b/systemcenteressentials


Service Manager Data Warehouse schema now available


Here are the schema diagrams you’ve been waiting for! If you’re not familiar with developing management packs, writing your first few custom queries against the data warehouse can be intimidating. The database schema is based on the common management pack model, which means the relational database objects and relationships benefit from class inheritance, so you should familiarize yourself with the model.

In this blog post I’m going to explain the different types of tables in the data warehouse, which I’ve color coded in the attached schema diagrams. In my next post I’m going to provide a bit deeper knowledge on how to find your way around the views you need and some best practices for using them to write your custom reports.

Click here to download the Service Manager 2010 Data Warehouse schema or keep reading to learn more about how to use it.

A brief overview of the types of tables in the data warehouse

No matter how many tables are in the warehouse, there are only three types of tables. It’s important to understand what each type of table is used for:

1. Dimensions

Dimensions represent the classes, where each row in the dimension is an instance of the class and each column is a property. Enum properties, however, are stored in “outriggers”, which are like dimensions except they have one row per item in a list which describes a class instance. See Outriggers below.

2. Fact tables

Fact tables are the most notable difference between a data warehouse and a transaction processing system. Generally fact tables are used to track transactions, or things that happen, over time. These transactions are usually quantified and summarized, so they get represented as metrics (called measures in data warehousing terms).

In Service Manager 2010, there are two types of fact tables:

a. Relationship fact tables

Relationship fact tables are used to track the relationships between instances of classes over time.

For example, in the Service Manager model there is a relationship called WorkItemAssignedToUser which enables assigning a user to a WorkItem. As the WorkItem is assigned or reassigned to a user, a new row is inserted into the relationship WorkItemAssignedToUser fact table which targets this relationship.

Relationship fact tables also have CreatedDate and DeletedDate columns which enable determining when the relationship was in effect.  If the DeletedDate column is null, the relationship is currently active.
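As an illustration, a query along the following lines returns the assignment history of a single work item from this fact table (the work item ID here is a made-up example; the join columns follow the queries shown later in this post):

```sql
-- Assignment history of one work item: one row per (re)assignment,
-- with CreatedDate/DeletedDate bounding when each assignment was in effect.
SELECT workitem.Id
, userdim.UserName
, fact.CreatedDate
, fact.DeletedDate
FROM WorkItemAssignedToUserFactvw fact
JOIN WorkItemDimvw workitem ON fact.WorkItemDimKey = workitem.WorkItemDimKey
JOIN UserDimvw userdim ON fact.WorkItemAssignedToUser_UserDimKey = userdim.UserDimKey
WHERE workitem.Id = 'IR1234'   -- example ID only
ORDER BY fact.CreatedDate
```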

All the code required to populate and maintain these fact tables is automatically generated once the fact table is defined in a management pack.  More on creating relationship fact tables in this blog post.

b. Custom fact tables

Custom fact tables are fact tables for which a developer can write custom code and populate based on their specific business requirements.

Out of the box we have a few custom fact tables which can be quite useful. One of them is the IncidentStatusDurationFact. This fact table tracks every time an Incident’s Status changes. The measure in this fact table is the TotalTimeMeasure, which is the duration in minutes which the incident remained in that status.

This enables measuring both the total process time as the Incident proceeds through its lifecycle as well as the number of transitions (i.e., how many times did the incident move from Active to Pending, and how long was it Pending before being reactivated).

For custom fact tables, the Service Manager Data Warehouse infrastructure will automatically generate the code required to extract the data from Service Manager into the warehouse, and to load the data mart from the Repository database, but the transform code must be provided by the developer creating the custom fact table.
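To make that concrete, a query along these lines could sum up the time incidents spent in each status.  This is only a sketch: the join column names are assumed from the naming conventions described in this post, so verify them against the schema diagram before use.

```sql
-- Sketch: total minutes incidents spent in each status, by localized status name.
-- Join column names are assumed from this post's naming conventions.
SELECT Strings.DisplayName AS Status
, SUM(statusduration.TotalTimeMeasure) AS TotalMinutes
FROM IncidentStatusDurationFactvw statusduration
JOIN IncidentStatusvw statusenum ON statusduration.IncidentStatus_IncidentStatusId = statusenum.IncidentStatusId
JOIN DisplayStringDimvw Strings ON statusenum.EnumTypeId = Strings.BaseManagedEntityId
WHERE Strings.LanguageCode = 'ENU'
GROUP BY Strings.DisplayName
```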

3. Outriggers

An outrigger describes an instance of a class. The “Lists” or enumeration properties in Service Manager are used to populate outriggers which describe their respective classes. For example there’s an IncidentClassification outrigger which describes Incidents, a ChangeCategory outrigger which describes Change Requests, a ProblemResolution outrigger which describes Problems and more.

Interpreting and using the Data Warehouse schema diagram

To make the Visio diagram easier to read, I’ve grouped the tables into separate tabs based on subject matter. However, it’s really important to understand that many types of queries will need to span the subject matters. For example, if you want to identify the Configuration Items with the most incidents, you’ll need to join tables from the Incident tab, Work Item tab and Config Item tab. This may not be readily apparent at first, so let’s dig a bit deeper on how this all works.

1. Dimensions

As I mentioned above, each dimension represents a class.  However, the dimensions help to abstract away the complexity of the class hierarchy.  For example, there are several different types of “Computer” classes, but they are all represented by the Computer dimension.  Each dimension has a row for each instance of the class it targets and of every class which extends or derives from that class.

For example, Incidents and Problems are classes which are types of TroubleTicket, which is in turn a type of WorkItem, which is in turn a type of Entity.  Each class in this hierarchy could have a dimension.  Each dimension in this hierarchy contains a row not only for each instance of its class but also for each instance of its descendant classes.  In the example below:

  • EntityDimvw contains a row for every instance of all classes
  • WorkItemDimvw contains a row for each Incident and Problem (and other work items not reflected below)
  • TroubleTicket doesn’t have its own dimension
  • Incident and Problem both have their own dimensions (IncidentDimvw and ProblemDimvw, respectively)

What’s really cool about this is that the EntityDimKey (the surrogate key of the EntityDim) as well as the BaseManagedEntityID is present in each dimension. This enables you to walk up the hierarchy to traverse fact tables which don’t directly join to the dimensions you need.

A classic example of needing to traverse the hierarchy is the WorkItemAssignedToUserFact.  If you want to find out who an Incident is assigned to, you can join IncidentDimvw => WorkItemDimvw => WorkItemAssignedToUserFactvw => UserDimvw:

Select Top 10 incident.Id
, incident.Title
, userdim.UserName as AssignedToUser
From IncidentDimvw incident
JOIN WorkItemDimvw workitem on incident.EntityDimKey = workitem.EntityDimKey
JOIN WorkItemAssignedToUserFactvw assignedtouser on workitem.WorkItemDimKey = assignedtouser.WorkItemDimKey
JOIN UserDimvw userdim on assignedtouser.WorkItemAssignedToUser_UserDimKey = userdim.UserDimKey
Where assignedtouser.DeletedDate is null

At first this may seem a bit odd and an unnecessary step. You may be wondering why not simply have an “IncidentAssignedToUserFact” and not worry about this class hierarchy stuff. The truth is, being a model-based data warehouse sometimes makes tables and columns a little harder to figure out at first, but once you understand the usage pattern you can actually see way more uses for the “generic” nature of some of these fact tables. For example, by tracking the history of Work Item assignments in one fact table, you can quickly get a holistic view of every Work Item assigned to a particular user:

Select workitem.Id
, workitem.Title
, mt.TypeName
From WorkItemDimvw workitem
JOIN WorkItemAssignedToUserFactvw assignedtouser on workitem.WorkItemDimKey = assignedtouser.WorkItemDimKey
JOIN UserDimvw userdim on assignedtouser.WorkItemAssignedToUser_UserDimKey = userdim.UserDimKey
JOIN EntityManagedTypeFactvw entityfact on workitem.EntityDimKey = entityfact.EntityDimKey
JOIN ManagedTypeDimvw mt on entityfact.ManagedTypeDimKey = mt.ManagedTypeDimKey
Where assignedtouser.DeletedDate is null
AND userdim.UserName = 'Consetetur Takimata'

You can see this user has Activities, Incidents, and a Change Request Assigned to them.

2. Fact tables

The Relationship fact tables are pretty extensive, but sometimes because they target relationships which are somewhat “generic” it may be hard to visualize how to use them. For example:

  1. The WorkItemAboutConfigItemFact can be used to get the number of work items (i.e., incidents, change requests, or problems) which affect a config item (computer, service, etc.)
  2. The WorkItemRelatesToConfigItemFact can be used to get the number of work items (i.e., incidents, change requests, or problems) which relate to a config item (computer, service, etc.)

The naming convention of the foreign key columns in the relationship facts helps to guide you as to which dimensions to join to.

  • The foreign key which points to the source endpoint of the relationship is named identically to the primary key of the corresponding dimension. For example, in the WorkItemAboutConfigItemFact, the WorkItemDimKey points to the WorkItemDimvw dimension.
  • The foreign key which points to the target endpoint of the relationship is prefixed with the <relationship name>_ then the name of the primary key of the corresponding dimension. In the WorkItemAboutConfigItemFact, the WorkItemAboutConfigItem_ConfigItemDimKey points to the ConfigItemDimvw dimension.
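Following this convention, the earlier example of finding the Configuration Items with the most work items against them reduces to a single join pattern.  Here is a sketch (the DisplayName column on ConfigItemDimvw is an assumption – check the schema diagram):

```sql
-- Count of work items per config item, using the FK naming convention:
-- source FK = WorkItemDimKey, target FK = WorkItemAboutConfigItem_ConfigItemDimKey.
SELECT config.DisplayName AS ConfigItem
, COUNT(*) AS WorkItems
FROM WorkItemAboutConfigItemFactvw fact
JOIN WorkItemDimvw workitem ON fact.WorkItemDimKey = workitem.WorkItemDimKey
JOIN ConfigItemDimvw config ON fact.WorkItemAboutConfigItem_ConfigItemDimKey = config.ConfigItemDimKey
WHERE fact.DeletedDate IS NULL
GROUP BY config.DisplayName
ORDER BY WorkItems DESC
```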

You can “daisy chain” a series of relationship fact tables together to get more indirect relationships for more complex analyses.  It’s a bad practice to join fact tables directly to each other, but when they share a common dimension it’s perfectly acceptable to “drill across” from one fact table to another.  For example, to determine how many change requests were approved by a user:

Select changerequest.ID
, changeRequest.Title
, userdim.UserName
, strings.DisplayName as ReviewerDecision
from ChangeRequestDimvw changerequest
JOIN WorkItemDimvw workitem on changerequest.EntityDimKey = workitem.EntityDimKey
JOIN WorkItemContainsActivityFactvw workitemactivity on workitem.WorkItemDimKey = workitemactivity.WorkItemDimKey
JOIN ActivityDimvw activity on workitemactivity.WorkItemContainsActivity_ActivityDimKey = activity.ActivityDimKey
JOIN ReviewActivityDimvw reviewactivity on activity.EntityDimKey = reviewactivity.EntityDimKey
JOIN ReviewActivityHasReviewerFactvw reviewactivityreviewer on activity.ActivityDimKey = reviewactivityreviewer.ActivityDimKey
JOIN ReviewerDimvw reviewer on reviewactivityreviewer.ReviewActivityHasReviewer_ReviewerDimKey = reviewer.ReviewerDimKey
JOIN ReviewerIsUserFactvw reviewuser on reviewer.ReviewerDimKey = reviewuser.ReviewerDimKey
JOIN UserDimvw userdim on reviewuser.ReviewerIsUser_UserDimKey = userdim.UserDimKey
JOIN ReviewerDecisionvw decision on reviewer.Decision_ReviewerDecisionId = decision.ReviewerDecisionId
JOIN DisplayStringDimvw strings on decision.EnumTypeId = strings.BaseManagedEntityId
WHERE userdim.UserName ='Consequat Vulputate' AND strings.LanguageCode ='ENU'

Notice the last join to DisplayStringDimvw – this enables bringing the localized string for the ReviewerDecision outrigger into the result set.  There are additional tricks and things to consider when localizing a report, and we’ll cover that in more detail in a separate blog post.

3. Outriggers

As I mentioned above, an outrigger describes an instance of a class. Getting a count or list of Incidents, for example, is rarely as useful as filtering or grouping by the Status, Classification or Priority of the Incident. These are a discrete set of known values usually populated in “Lists” or enumerations via the Service Manager console. When an outrigger is populated from an enumeration, it can also be localized and represent a hierarchy.

If you look at the data mart schema, it’s tempting to try to obtain an enum property value from the dimension itself.  Yes, there is a column in the IncidentDim called Status, but unfortunately it’s not there for you to use in your reports – we need it for ETL (data processing) purposes.  Instead, we provided an outrigger table which understands the hierarchical structure of the Lists in SM.  For example, within our Change Management solution, each Change Request can be assigned a Change Area.  Each Change Area can roll up into a hierarchy of Change Areas.

You could get the flat list with a query like this one (notice the join to DisplayStringDimvw to get the localized display strings):

SELECT outrigger.ordinal
, Strings.DisplayName AS ChangeArea
, COUNT(*) AS ChangeRequests
FROM ChangeRequestDimvw dim
Join ChangeArea outrigger ON dim.Area_ChangeAreaId = outrigger.ChangeAreaId
Join DisplayStringDimvw Strings ON outrigger.EnumTypeId = Strings.BaseManagedEntityId
WHERE Strings.LanguageCode = 'ENU'
GROUP BY Outrigger.Ordinal, strings.DisplayName
Order by Ordinal

If you’ve taken the time to build out some hierarchies within the list, you could use the additional details in the corresponding outrigger tables to visually represent the hierarchy in your report. One approach is to use a Common Table Expression (CTE) to recursively construct the hierarchy, then order by the Ordinal property to visually represent the hierarchy. We can make this even more visually appealing & useful when we get into custom report authoring, so I’ll leave that for another blog post.

 
WITH ChangeArea_CTE
( ChangeAreaID
, EnumTypeID
, ParentIDPath
, Level
, Ordinal) AS
( SELECT ChangeAreaID
  , EnumTypeID
  , ISNULL(CONVERT(varchar, ParentID), '0') AS ParentIDPath
  , -1 AS Level
  , Ordinal
  FROM ChangeArea
  WHERE ParentId IS NULL
  UNION ALL
  SELECT ChangeArea.ChangeAreaID
  , ChangeArea.EnumTypeID
  , CONVERT(varchar, ChangeArea_CTE.ParentIDPath + '.' + CONVERT(varchar, ChangeArea.ParentID)) AS ParentIDPath
  , ChangeArea_CTE.Level + 1
  , ChangeArea.Ordinal
  FROM ChangeArea
  JOIN ChangeArea_CTE ON ChangeArea.ParentId = ChangeArea_CTE.ChangeAreaID
  WHERE ChangeArea.ParentId IS NOT NULL
)
SELECT REPLICATE(' ', Level) + Strings.DisplayName AS ChangeArea
, ISNULL(dim.ChangeRequests, 0) AS ChangeRequests
FROM ( SELECT Area_ChangeAreaId
       , COUNT(*) AS ChangeRequests
       FROM ChangeRequestDimvw
       GROUP BY Area_ChangeAreaId
     ) dim
RIGHT JOIN ChangeArea_CTE outrigger ON dim.Area_ChangeAreaId = outrigger.ChangeAreaId
JOIN DisplayStringDimvw Strings ON outrigger.EnumTypeId = Strings.BaseManagedEntityId
WHERE Strings.LanguageCode = 'ENU' AND Level >= 0
ORDER BY Ordinal
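If the SQL recursion feels opaque, the same walk can be sketched in a few lines of Python. This is a toy illustration only — the area names and dictionary shape are invented sample data, not the real ChangeArea schema: start at the root areas, descend depth-first to the children, order siblings by ordinal, and indent by level, just as the CTE plus REPLICATE does.

```python
# Toy data standing in for the ChangeArea outrigger hierarchy (invented names).
areas = [
    {"id": 1, "name": "Hardware", "parent": None, "ordinal": 1},
    {"id": 2, "name": "Server",   "parent": 1,    "ordinal": 1},
    {"id": 3, "name": "Laptop",   "parent": 1,    "ordinal": 2},
    {"id": 4, "name": "Software", "parent": None, "ordinal": 2},
]

def render(parent=None, level=0, out=None):
    """Depth-first walk, siblings ordered by ordinal, indented by level."""
    out = [] if out is None else out
    for a in sorted((x for x in areas if x["parent"] == parent),
                    key=lambda x: x["ordinal"]):
        out.append("  " * level + a["name"])
        render(a["id"], level + 1, out)
    return out

print("\n".join(render()))
```

The SQL version does the same thing set-wise: the anchor member selects the roots, the recursive member joins children to the rows found so far, and the Level counter drives the indentation.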

Mapping classes to dimensions in the warehouse

There are many classes and relationships whose data is extracted into the Data Warehouse but for which there are no dimension or fact tables out of the box. It’s easy to create a management pack that defines your own dimensions and fact tables without writing any code. However, if you’re not sure whether the warehouse already has a dimension you might need, here’s an unsupported query which walks up the class hierarchy from a specific class (see the filter on DerivedManagedTypeName at the end of the query) and tells you which existing dimensions contain rows for that class. Depending on how high up in the hierarchy they sit and how they were modeled, they may or may not contain the columns you need. However, as I mentioned above, walking up the hierarchy is an important part of drilling across fact tables, so this may help guide you in your exploration of the data mart. This query, unlike all the others provided in this post, must be run in the StagingAndConfig database:

WITH ClassHierarchy (
  ManagedTypeID
, ManagedTypeName
, DerivedManagedTypeID
, BaseManagedTypeId
, DerivedManagedTypeName
, Level
, DiscoveryPath)
AS (
SELECT ManagedTypeID
, TypeName AS ManagedTypeName
, ManagedTypeID AS DerivedManagedTypeID
, BaseManagedTypeId
, TypeName AS DerivedManagedTypeName
, 0 AS Level
, CONVERT(varchar(max), '>' + TypeName) AS DiscoveryPath
FROM ManagedType

UNION ALL
SELECT ch.ManagedTypeID
, ch.ManagedTypeName
, mt.ManagedTypeID
, mt.BaseManagedTypeId
, mt.TypeName
, ch.Level + 1 AS Level
, CONVERT(varchar(max), ch.DiscoveryPath + '>' + mt.TypeName) AS DiscoveryPath
FROM ManagedType mt
JOIN ClassHierarchy ch ON ch.DerivedManagedTypeId = mt.BaseManagedTypeID
)
SELECT ManagedTypeName
, ISNULL(DiscoveryPath, '>') AS RelationshipPath
, DerivedManagedTypeName
, dim.DimensionName
FROM ClassHierarchy ch
LEFT JOIN Dimension dim ON ch.ManagedTypeID = dim.TargetId
WHERE DerivedManagedTypeName = 'System.WorkItem.Incident'
ORDER BY Level DESC

At last, the diagram itself

I hope this post has gotten you warmed up and excited to write some custom queries. In the diagram I’ve color coded each table as a dimension (blue), fact (green) or outrigger (yellow). Here’s an example, but check out the attached Visio for more details.

Click here to download the Service Manager 2010 Data Warehouse schema. An example for the Incident tab is below


Updating a Change Request when Activities are Updated Using PowerShell in a Workflow


Andreas Rynes, a Microsoft Consulting Services consultant, has come up with a clever solution to a problem.  The problem is how to surface up the contained activity details to the change request level.  This makes it easier to format/send notifications and such things.  He wrote a guest blog post about his solution which uses the SMLets PowerShell module (a CodePlex solution) to get the activity and update the parent change request description.

Thanks for figuring this out and sharing it Andreas!

====================================================

During my work with Service Manager I got a requirement that data from custom activities should be rolled up to the main form of the change request. In the case of a change request template that has multiple activities, all of which have custom fields, every time a single activity is updated the complete list of activity data (not just the activity that was updated, but all the other activities from the same change request) should be written to the description field of the parent change request, or updated if it already exists. Manually entered descriptions from users should be kept and not overwritten by the solution.

The overall idea is to implement this in a generic way, so that even custom activity classes implemented in the future will work with this solution.

This is how it might look, after you’ve implemented my solution:

clip_image002

This seems impossible with the out-of-the-box features of SCSM, but it is possible with a custom workflow and Windows PowerShell. Thanks to Jim Truher there are Service Manager cmdlets for PowerShell, and thanks to the product itself there is an easy way to create a custom workflow with a PowerShell script.

1. Custom Activity

I won’t explain in this blog post how to create a custom activity; there is an excellent post about that from Travis Wright and Jim Pitts: http://blogs.technet.com/b/servicemanager/archive/2010/11/16/how-to-automate-vm-provisioning-in-20-minutes-using-service-manager-and-opalis.aspx

So after you’ve created a similar sample of a custom activity that has some custom attributes and a custom form to enter the data and you’ve imported that to SCSM, you can create an activity template and a change request template that has this new custom activity defined. (You’ll find both templates in the MP attached to this blog post)

2. Custom Workflow in the SCSM Authoring Tool

Now it’s time to create the custom workflow, go to Service Manager Authoring Tool, create a new MP and a new workflow, give it a name and enter some description in the wizard.

On the next page you are asked to choose the trigger (time based or database based). This time we need to select “Run only when a database object meets specified conditions”, as we want the workflow to trigger when an activity is updated/changed.

On the next page click the Browse button and choose Activity. This means the workflow triggers whenever a class that inherits from Activity is updated (in our case a custom activity that inherits from ManualActivity, which in turn inherits from the base class Activity).

Choose “When an object of the selected class is updated” in the Change Event box. You could possibly define additional criteria, but we won’t this time (this should stay a generic solution). Then you’re done; click Next, Create, and Close.

clip_image003

After that you’ll find the new workflow completely empty in the main pane in the middle and you’re able to add a new shape to the area saying “Drop activities to create a Sequential Workflow”.

Drag and drop a “Windows Powershell Script” shape to the workflow and change the name to something that makes sense for your sample.

clip_image005

This is the result at that stage:

clip_image006

3. Powershell Script and Parameter

As long as you see the red exclamation mark in the corner of a shape, some information is missing. In this case the script body and its parameter are not defined yet, so the PowerShell activity does nothing so far. Let’s define those values:

clip_image008

The script body:

set-executionpolicy -executionPolicy ByPass
$a = (get-module|%{$_.name}) -join " "
if(!$a.Contains("SMLets")){Import-Module SMLets -ErrorVariable err -Force}

This part defines the execution policy for the PowerShell session. For this sample I’ve used ByPass, which allows everything to run; that might not be ideal, and you should consider digitally signing the scripts you are using. Then I get all loaded modules (get-module), check whether the SMLets module is already loaded (Contains), and load it if it hasn’t been loaded before. The -ErrorVariable is used for troubleshooting only, so if you’re not going to use the variable you can skip it. I’ll describe troubleshooting techniques for this in a separate post later.

$indexend = (Get-SCSMRelationshipObject -Target (Get-SCSMClass System.WorkItem.Activity$)|?{$_.TargetObject -match $activity_id}).SourceObject.ToString().IndexOf(':');
$cr_id = (Get-SCSMRelationshipObject -Target (Get-SCSMClass System.WorkItem.Activity$)|?{$_.TargetObject -match $activity_id}).SourceObject.ToString().Substring(0,$indexend);

Then I’m using the SMCmdlet for the first time to get the parent CR from the activity. The activity id is passed into the script (we’ll define that later!) and with Get-SCSMRelationshipObject I’ll get the parent change request of the activity parameter.

$activities = Get-SCSMRelatedObject -SMObject (get-scsmobject (get-scsmclass System.workitem.ChangeRequest$)|?{$_.Id -eq $cr_id}) -Relationship (Get-SCSMRelationshipClass System.WorkItemContainsActivity)

$ActivityDict = @{}
$countKeys = 0
foreach($activity in $activities)
{
    foreach($val in $activity.Values)
    {
        if($val.Type -match "^z.*")
        {
            $countKeys++
            $ActivityDict[$activity.id + "@" + $val.Type] = $val.Value
        }
    }
}

This part grabs all the related activities from the parent change request using Get-SCSMRelatedObject and the relationship class “System.WorkItemContainsActivity”. For each activity I then look for values from fields whose names start with z, add them to a dictionary, and count them (for later use).

$description_start = "-------------------------------------------------" + "`n" + "-- Activity Data Begin (Do not change!) --" + "`n" + "-------------------------------------------------" + "`n"
$description_ende = "------------------------------------------------" + "`n" + "-- Activity Data End (Do not change!) --" + "`n" + "------------------------------------------------" + "`n"
$desc_beginindex = (Get-SCSMObject (Get-SCSMClass System.WorkItem.ChangeRequest$) -Filter "Id -eq $cr_id").Description.ToString().IndexOf("Activity Data Begin");
$desc_endeindex = (Get-SCSMObject (Get-SCSMClass System.WorkItem.ChangeRequest$) -Filter "Id -eq $cr_id").Description.ToString().IndexOf("Activity Data End");
if($desc_beginindex -eq -1) {$desc_beginindex = 0} else {$desc_beginindex -= 53}
if($desc_endeindex -eq -1) {$desc_endeindex = 0} else {$desc_endeindex += 87}

Now for the hard part of the solution: building the data string in a nice, readable way that is also easy to recognize, so it can be updated later. With the IndexOf method we look for an existing activity data block. If there is none we set the begin and end index to 0; otherwise we set them to the beginning and end of the data block (the offsets 53 and 87 account for the dashed marker lines surrounding the block).

$description = (Get-SCSMObject (Get-SCSMClass System.WorkItem.ChangeRequest$) -Filter "Id -eq $cr_id").Description.ToString()
$descriptionnew = $description_start
foreach($a in $ActivityDict.Keys)
{
    $descriptionnew += $a.Substring(0, $a.IndexOf('@')) + " / " + $a.Substring($a.IndexOf('@') + 1, $a.Length - $a.IndexOf('@') - 1) + ": " + $ActivityDict[$a].ToString() + "`n"
}
$descriptionnew += $description_ende
$descriptionnew += $description.Substring(0, $desc_beginindex) + $description.Substring($desc_endeindex, $description.Length - $desc_endeindex)

Now we grab the description that is already in the change request with Get-SCSMObject, filtering by the change request ID. The goal is to keep that data in the description field and not overwrite it with the new values. Then we iterate over the collection and build up the new description: first the activity data, then the existing data (minus the activity data block that was already there).

if($countKeys -gt 0){
    Set-SCSMObject -SMObject (Get-SCSMObject (Get-SCSMClass System.WorkItem.ChangeRequest$) -Filter "Id -eq $cr_id") -Property 'Description' -Value $descriptionnew
}
remove-module -name SMLets -Force

The count of custom fields whose names start with z determines whether the new description is written to the change request. If there are no custom fields, nothing is changed. So we check the $countKeys variable and then use Set-SCSMObject to set the description field. At the end we remove the SMLets module so that we can import it again next time (if you don’t remove it, importing the module again can be a problem).
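The marker-block technique at the heart of the script can be expressed more generally. Here is a hedged Python sketch of the idea — not the workflow code itself, and the sample field text is invented: replace whatever sits between the Begin/End markers while preserving everything the user typed outside them.

```python
# Marker lines matching the ones the script embeds in the description field.
BEGIN = "-- Activity Data Begin (Do not change!) --"
END = "-- Activity Data End (Do not change!) --"

def upsert_activity_block(description, block_body):
    """Insert or replace the marker-delimited activity block, keeping
    any manually entered text outside the markers untouched."""
    new_block = BEGIN + "\n" + block_body + "\n" + END + "\n"
    begin = description.find(BEGIN)
    end = description.find(END)
    if begin == -1 or end == -1:
        # No existing block: prepend it to the manual description.
        return new_block + description
    # Replace the old block, preserving text before and after it.
    after = end + len(END)
    if description[after:after + 1] == "\n":
        after += 1
    return description[:begin] + new_block + description[after:]
```

Searching for the marker strings themselves avoids the magic offsets (53 and 87) the PowerShell version uses to skip over the dashed lines.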

Now it’s time to define the only input parameter for the script. Click on the “Script Properties” tab and enter activity_id in the name field (don’t use the $ here; only inside the script do you reference it as $activity_id).

clip_image010

In the value choose a class property and choose the ID of the activity class. This is the only input parameter that we need for that solution.

clip_image012

4. Deployment

Now you’re done; save the solution in the Authoring Tool. A DLL is then generated in the Authoring Tool location. This DLL has the same name as your MP and must be copied to the Service Manager installation folder, and the management pack must be imported in the SCSM console as well.

The last thing that needs to be done is the download and installation of the Service Manager Cmdlet from that location: http://smlets.codeplex.com/

Be sure to unblock the SMLets archive after downloading and before installing it (see the documentation at http://smlets.codeplex.com/documentation).

Copy the content as described to the default PowerShell module location c:\windows\system32\WindowsPowerShell\v1.0\Modules\SMLets

In the attachment you’ll find the management pack that contains the custom activity, a custom form for that activity, the templates for the activity and change request, and of course the custom workflow.

Create a report model with localized outriggers (aka “Lists”)


If you've watched my Reporting and Business Intelligence with Service Manager 2010 webcast and followed along in your environment, you may have unintentionally created a report which displays enumeration GUIDs instead of Incident Classification strings, like the one below. Not too useful. In this post I'll show you the simple way to fix your report model to include the display strings for outriggers for a specific language, and in a follow-on post I'll share more details about how to localize your reports and report models.

You may be wondering what happened. This is because we made a change in SP1 to handle outrigger values consistently, which removed the special handling we had for our out-of-the-box enumerations in outriggers. If you're now wondering what outriggers are, read up on the types of tables in the data warehouse in my last post, in which I provided the Service Manager data warehouse schema.

Here's the screenshot of the report we need to fix, the rest of the post will explain how to fix it.

 

Replace table binding in the Data Source view with Query binding

Rather than including references to the outriggers directly (in the screenshot below the outriggers are IncidentClassificationvw, IncidentSourcevw, IncidentUrgencyvw, and IncidentStatusvw) we'll replace these with named queries.

To do this, you simply right click the "table" and select Replace Table > With New Named Query.

 

You then paste in your query, which joins to DisplayStringDimvw and filters on the language of your choice. Repeat for each outrigger.

SELECT outrigger.IncidentClassificationId, Strings.DisplayName AS Classification
FROM IncidentClassificationvw AS outrigger
INNER JOIN DisplayStringDimvw AS Strings ON outrigger.EnumTypeId = Strings.BaseManagedEntityId
WHERE (Strings.LanguageCode = 'ENU')

 

Create & publish your report model

To create a simple report model, right click the Report Models node in the Solution Explorer (right pane) and select Add New Report Model. Follow the wizard, selecting the default options.

 

If you want to clean it up a little, double click the Report Model, then select IncidentDim on the left.

Scroll down the properties in the center and you'll notice there is now a Role added to the IncidentDim named Classification Incident Classification, along with an Attribute named Classification. This is because using outriggers to describe dimensions is an industry standard approach and SQL BI Dev Studio understands that these outriggers should essentially get added as properties directly to the Incident dimension for the easiest end user report authoring experience.

The attribute is populated directly by the column I mentioned you should not use in reports, so you should select and delete that attribute from your model. You may also rename the Role "Classification Incident Classification" to a more user-friendly name like "Incident Classification" if you'd like to.

 

Now save, right click your report model and click Deploy.

Create a report to try out your new report model

Open up SQL Reporting Services Report Builder (below screenshots are using Report Builder 3.0). If you haven't gotten a chance to check it out yet, here's a good jump start guide.

 

Follow the wizard, select your newly published report model:

 

Drag & drop your Incident Classification and Incidents measure. Hit the red ! to preview.

 

Drag & drop to layout the report

 

Continue with the wizard, selecting the formatting options of your choice. If you would like, you can then resize the columns, add images and more. For our quick and simple example, though, I'm going to intentionally leave formatting reports for another post. If you've been following along, your report should now look like this:

 

Go ahead and publish to the SSRS server under the /SystemCenter/ServiceManager/ folder of your choice to make the report show up in the console.

 

 

New Community Opalis Integration Pack for Service Manager


Marcel Zehner over at SCSMFAQ.ch has recently built and made available a new Opalis integration pack for SCSM!  For now it includes three new workflow activities:

  • Auto-close Resolved Incidents
  • Auto-close Resolved Problem
  • Export Unsealed Management Packs

He is looking for feedback and suggestions on what else to include.

More details and download here:

http://blog.scsmfaq.ch/2011/03/29/opalis-integration-pack-extension-for-service-manager/

Nice work Marcel!  Looking forward to seeing what else you come up with!

Like our System Center blogs? Now you have an app for that


Ever since Windows Phone 7 was launched late last year there’s been a growing buzz surrounding Microsoft’s new phone OS, and with the recent announcement that the Windows Phone Marketplace had surpassed 10,000 apps in those 4-5 months and was growing fast (now well over 12,000), I took a look and said “Hey, why not me too?”  And with that The System Center Blog Aggregator was born.

So why make an app that simply feeds the System Center blog content to your phone when any old newsreader can be made to do the same?  Well, I had a couple reasons.  First, feedback suggested that many of our readers are IT pros who specialize in System Center and don’t necessarily follow a lot of other sites on a regular basis so they have no need to load up and configure a full blown newsreader just to stay up on our content.  In addition to that, I was curious how the process worked and how hard it would be to create an app and get it published.  Even though I work for Microsoft, the whole process was kind of a mystery to me. 

Then a couple weeks ago I saw that AppMakr (no affiliation) announced their support for Windows Phone and I knew I could use that to easily whip up a cool content aggregator that customers as well as other Microsoft employees out in the field could use to keep up with the latest in the world of System Center.  So long story short, a mere 48 hours later I had my app completed, tested, certified by Microsoft and published in the Marketplace.

Here are a couple screen shots to give you an idea of what it looks like.  After you load and start The System Center Blog Aggregator you see the main screen that has our blogs listed in alphabetical order:

BlogAppScreenShot1

From there you simply select the blog you’re interested in and all the latest posts are viewable:

BlogAppScreenShot2

From there you can select and read any post you like.  So there you have it, an app that allows you to follow our content quickly and easily right on your snazzy new Windows Phone.  If you have a WinPhone you should load up The System Center Blog Aggregator and check it out.  It’s free after all so what do you have to lose?  The Zune link is http://social.zune.net/redirect?type=phoneApp&id=fe9b6ff1-8355-e011-854c-00237de2db9e but if you want to load it directly from your phone just search on “System Center” in the Marketplace and you’ll find it. 

Oh, and what about other platforms besides Windows Phone?  I should have a version for Android some time next week and as soon as it’s available I’ll let you know.  As of now I don’t really have any plans on doing an iOS version due to the costs but if it turns out there’s any kind of demand for it I’ll definitely revisit it later on down the road.

Enjoy!

J.C. Hornbeck | System Center Knowledge Engineer

The App-V Team blog: http://blogs.technet.com/appv/
The WSUS Support Team blog: http://blogs.technet.com/sus/
The SCMDM Support Team blog: http://blogs.technet.com/mdm/
The ConfigMgr Support Team blog: http://blogs.technet.com/configurationmgr/
The SCOM 2007 Support Team blog: http://blogs.technet.com/operationsmgr/
The SCVMM Team blog: http://blogs.technet.com/scvmm/
The MED-V Team blog: http://blogs.technet.com/medv/
The DPM Team blog: http://blogs.technet.com/dpm/
The OOB Support Team blog: http://blogs.technet.com/oob/
The Opalis Team blog: http://blogs.technet.com/opalis
The Service Manager Team blog: http://blogs.technet.com/b/servicemanager
The AVIcode Team blog: http://blogs.technet.com/b/avicode
The System Center Essentials Team blog: http://blogs.technet.com/b/systemcenteressentials
The Server App-V Team blog: http://blogs.technet.com/b/serverappv

clip_image001 clip_image002

Properly Querying SCSM Using SMLets Get-SCSMObject cmdlet


Jim Truher and I have been seeing some usage of the SMLets Get-SCSMObject cmdlet lately that will cause really bad performance when run in a production environment.  Things will work fine in a dev/test environment where the quantity of data is small but as soon as you put it into a fully loaded production environment it will cause massive problems.  This same kind of bad programming could be done in .NET code, but I never see it happen there.  I think for some reason it is just easier to sort of not think through what is really happening when you are writing scripts with the all powerful pipe in PowerShell.

Here is the bad example:

$Resolved = Get-SCSMEnumeration IncidentStatusEnum.Resolved$
Get-SCSMObject -ClassName System.WorkItem.Incident$ | Where-Object{$_.Status -ne $Resolved}

Can you see where the bad part is?

The bad part here is that Get-SCSMObject in this case is going to get every incident in the database and bring it back to the management server (or wherever this command is being run).  Then Where-Object is going to loop through all of those comparing the Status property to see if it is Resolved.  This works great on a dev or test system that just has a few incidents in it.  In a production system with 150,000 incidents, this is going to be very slow and make the processor work very hard on the management server.

This really should be done by leveraging the power of SQL Server to just get the incidents we care about.  There is a –Filter parameter on the Get-SCSMObject cmdlet that can be used for these purposes.  Rewriting the above using –Filter would look like this:

$ResolvedId = (Get-SCSMEnumeration IncidentStatusEnum.Resolved$).Id
Get-SCSMObject -ClassName System.WorkItem.Incident$ -Filter "Status -ne $ResolvedId"

Not only is that less typing but it is also vastly better performing!
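The same principle applies in any data-access stack, not just SMLets: push the predicate into the database instead of shipping every row to the client and filtering there. A small sqlite3 sketch (toy table and invented rows, purely illustrative):

```python
import sqlite3

# Toy incident table standing in for the Service Manager database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE incident (id TEXT, status TEXT)")
conn.executemany("INSERT INTO incident VALUES (?, ?)",
                 [("IR1", "Active"), ("IR2", "Resolved"), ("IR3", "Active")])

# Bad: fetch every row, then filter on the client (like piping to Where-Object).
bad = [r for r in conn.execute("SELECT id, status FROM incident")
       if r[1] != "Resolved"]

# Good: push the predicate into SQL (like using -Filter or -Criteria).
good = conn.execute(
    "SELECT id, status FROM incident WHERE status <> ?", ("Resolved",)
).fetchall()

assert bad == good  # Same answer; the second avoids shipping every row.
```

With three rows the two are indistinguishable; with 150,000 incidents, the client-side version transfers and scans the whole table while the database version returns only the matches.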

Right now the -Filter parameter only works with a single property as criteria. In the short term, you can use -Criteria as a way to handle multiple properties as criteria.

Bad Example:

$Now = (get-date).addhours(4)

$Closed = Get-SCSMEnumeration IncidentStatusEnum.Closed$

$Resolved = Get-SCSMEnumeration IncidentStatusEnum.Resolved$

Get-SCSMObject -ClassName ClassExtension_5cf048c3_5db1_4873_afb6_f85a7bfc8d3e | Where-Object{$_.Status -ne $Closed -and $_.Status -ne $Resolved -and $_.TargetResolutionTime -lt $Now -and $_.TargetResolutionTime -ne $Null}

 

Good Example:

$Now = (get-date).addhours(4)

$CId = (Get-SCSMEnumeration IncidentStatusEnum.Closed$).id

$RId = (Get-SCSMEnumeration IncidentStatusEnum.Resolved$).id

$Class = Get-SCSMClass ClassExtension_5cf048c3_5db1_4873_afb6_f85a7bfc8d3e$

$cType = "Microsoft.EnterpriseManagement.Common.EnterpriseManagementObjectCriteria"

$cString = "Status != '$CId' and Status != '$RId' and TargetResolutionTime < '$Now' and TargetResolutionTime Is Not Null"

$crit = new-object $cType $cString,$Class

Get-SCSMObject -criteria $crit

 

That’s a little harder than we would like it to be, though, so Jim is working on an improvement to -Filter for SMLets that will allow you to specify more advanced filter criteria, something like this:

Get-SCSMObject -Class $c -Filter "FirstName -eq 'Travis' -and LastName -eq 'Wright' -and HomePhone -like '467*'"


SCSM Deep Dive Summit


Back in February we hosted a Service Manager Deep Dive Summit where the Program Managers from the product team presented on various topics in depth:

  • Incident and Problem Management – Ketan Ghelani
  • Change Management  - Vladimir Bakhmetyev
  • CMDB & Connectors – Marc Umeno
  • Reporting and DW Administration – Chris Lauren
  • Customizations – Travis Wright

All of these sessions were recorded and are posted up on the Web now.  You can watch all of them on Sean Christensen’s Vimeo site or they are embedded below for your viewing enjoyment.

Incident and Problem Management

Change Management

CMDB & Connectors

Reporting and DW Administration

Customizations

Service Manager on New TechNet Gallery (Beta)


A new TechNet Gallery site has been set up (in beta for now) to enable people in the community to more easily share solutions with each other.  Right now we have just one contribution up there, from Steve Beaumont, to back up your unsealed MPs, but I’m looking forward to seeing what else people from the community can contribute.

Service Manager on TechNet Gallery

FAQ:

What is the difference between the TechNet Gallery and CodePlex?  CodePlex has support for source code control, versioning, collaboration, and other features.  The TechNet Gallery for SCSM makes it easier to find solutions specifically for SCSM instead of wading through all the projects on CodePlex.  So: think CodePlex for advanced code-based projects that you want to collaborate on with other people, and the TechNet Gallery for quick solutions that you want to be able to easily share with others.

SCSM Roadmap Update at MMS 2011


Well, I am finally starting to get caught up from MMS 2011!  It was great to see many of you at the conference!  I wish we could have MMS every month (especially if I could just attend and not have to present)!

I was going to write up a blog post on all the new updates on SCSM that were shared at MMS, but Anders Asp has already done a great job of that on his blog.  Since I am lazy, I’ll just point you over there instead of writing up essentially the same thing.  That way I’ll have more time to write other blog posts.

Before I do that though I do want to point out a couple of key updates on the SCSM roadmap that were announced at MMS 2011:

  • SCSM 2010 “R2” has been renamed to SCSM 2012.  Two reasons for this change: 1) The scope of SCSM 2012 has increased and therefore is more than just an “R2” type of release and 2) the name lines up with the names of the other System Center products in the suite that will be released at roughly the same time as  SCSM 2012 – Operations Manager 2012, Configuration Manager 2012, Virtual Machine Manager 2012, Data Protection Manager 2012, and Orchestrator 2012 (new name for Opalis)
  • In addition to what we announced at TechEd Europe back in Nov 2010, we also now plan to have a connector to Orchestrator 2012 to bring Orchestrator “runbooks” into SCSM.  From there you can easily include Orchestrator workflows as automated activities inside of change requests, service requests, and release records.  When those automated activities become active in the process of fulfilling those requests, SCSM will trigger the workflow directly in Orchestrator and monitor Orchestrator for completion of the automation.  This tight integration between SCSM and Orchestrator will open up even more possibilities for automation and will ensure that these automated steps are automatically monitored and controlled directly from inside of SCSM.
  • Prior to MMS 2011, we hadn’t really gone into much detail about what we meant by a “service catalog” and “service request fulfillment”.  Combined with the deep Orchestrator integration, this area of the product has really turned into the showcase feature area of the product that people are most excited about.  The new self-service portal will be built on SharePoint 2010 using Silverlight based web parts.  The entire portal and each of the web parts will be highly customizable using familiar SharePoint administration interfaces.  The request forms on the service catalog will be configuration driven and mapped to work item templates for easy configuration.
  • The schedule has remained unchanged with a beta planned for Q3 and RTM for end of year.

With that, please check out Anders’ blog posts (including lots of screenshots) on SCSM 2012 at MMS 2011:

Part 1: http://www.scsm.se/?p=297

Part 2: http://www.scsm.se/?p=342

And here is a blog post from Kurt VanHoecke too:

http://scug.be/blogs/scsm/archive/2011/03/30/notes-from-mms-service-manager-2012-overview.aspx

And here is an interview that our product manager did at MMS on SCSM 2012:

http://technet.microsoft.com/en-us/edge/sean-christensen-discusses-how-service-manager-2012-offers-easy-customization-of-processes-and-decisions

I wish you all could have been there, but hopefully this helps to share the excitement with you about what is coming in SCSM 2012!

FAQ: Why Can’t I Add Some Columns That I Want to Views?


People sometimes ask why they can’t include some columns in the views that they want.  Here is a common one:  Why can’t I add the affected configuration item(s) as a column in my incident view?

In order to understand the answer to this question we must understand the incident data model.  An “incident” record is really comprised of the following things:

  • The incident object itself, including all of its properties
      • Title
      • Description
      • Urgency
      • Impact
      • etc.
  • Relationships to other objects over relationship types with max cardinality = 1 (meaning there can be at most one related object)
      • Assigned To User
      • Primary Owner
      • Affected User
  • Relationships to other objects over relationship types with max cardinality > 1 (meaning there can be more than one related object)
      • Affected configuration items
      • Affected services
      • Knowledge articles
See this blog post for more information on using type projections in views.
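For context, a type projection is declared in a management pack roughly like the following. This is an illustrative sketch only: the ID, aliases, and the `WorkItem!` reference alias here are made up, while the relationship type names are the standard Assigned To/Affected User relationships; the real out-of-the-box projections live in the Incident library MPs.

```xml
<TypeProjections>
  <!-- Illustrative only: a projection over Incident that pulls in two
       max-cardinality = 1 relationships, suitable as a view target. -->
  <TypeProjection ID="My.Incident.ViewProjection"
                  Accessibility="Public"
                  Type="WorkItem!System.WorkItem.Incident">
    <Component Path="$Target/Path[Relationship='WorkItem!System.WorkItemAssignedToUser']$"
               Alias="AssignedTo" />
    <Component Path="$Target/Path[Relationship='WorkItem!System.WorkItemAffectedUser']$"
               Alias="AffectedUser" />
  </TypeProjection>
</TypeProjections>
```

A view targeted at a projection like this can offer the AssignedTo and AffectedUser components as columns, because each component resolves to at most one related object.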

If you want to display properties of the incident object itself, that is pretty straightforward.  The columns would simply just display the values of the properties.  For example:

ID       Title
IR12345  Printer is broken
IR12346  HRWeb is down

It’s also pretty straightforward to add columns for related objects where the relationship type max cardinality is 1.  For example, let’s say we wanted to show the Affected User’s domain and user name.

ID       Title              Affected User Domain  Affected User Username
IR12345  Printer is broken  redmond               twright
IR12346  HRWeb is down      redmond               billg

We run into some problems when we want to display properties of objects where the relationship type max cardinality is more than 1 though.  For example, let’s say we wanted to display the affected configuration items’ display names. If there was only one related configuration item we could simply just display it like this

ID       Title              Affected User Domain  Affected User Username  Affected CI Display Name
IR12345  Printer is broken  redmond               twright                 twright-laptop
IR12346  HRWeb is down      redmond               billg                   hrweb01

But what do we do when there is more than one affected configuration item?

We could do something like this:

ID       Title              Affected User Domain  Affected User Username  Affected CI Display Name
IR12345  Printer is broken  redmond               twright                 twright-laptop; printserver01
IR12346  HRWeb is down      redmond               billg                   hrweb01; hrweb02; hrweb03

But that is kind of hard to read.  It is also technically challenging given the way the grid data binding works, it makes it hard to sort or group by the Affected CI Display Name column, and it would slow down performance.
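A quick sketch shows why the concatenated column misbehaves (the data here is just the sample from the tables above): sorting on the joined string orders rows by whichever related CI happens to come first in the list, not by any meaningful property of the incident.

```python
incidents = [
    {"id": "IR12346", "title": "HRWeb is down",
     "cis": ["hrweb01", "hrweb02", "hrweb03"]},
    {"id": "IR12345", "title": "Printer is broken",
     "cis": ["twright-laptop", "printserver01"]},
]

# Flatten the many-valued relationship into one display string.
for inc in incidents:
    inc["ci_display"] = "; ".join(inc["cis"])

# "Sorting" by the flattened column just compares the concatenated
# strings character by character; reordering the same CI list would
# move the row even though the incident itself hasn't changed.
by_ci = sorted(incidents, key=lambda i: i["ci_display"])
```

And every row now carries a string built from N related objects, which is the performance cost mentioned above.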

We could do something like this:

ID       Title              Affected User Domain  Affected User Username  Affected CI Display Name
IR12345  Printer is broken  redmond               twright                 twright-laptop
                                                                          printserver01
IR12346  HRWeb is down      redmond               billg                   hrweb01
                                                                          hrweb02
                                                                          hrweb03

This approach is also hard to read, doesn’t use the space on the grid efficiently, and would become really unusable when there are potentially tens or hundreds of related items.

So, at least for now, you can only use properties of objects that are related by relationship types with max cardinality = 1 as columns in views.  That is why you see the Affected User, Assigned To User, etc. relationships in the view creation dialog and not Affected CI.

image

A few other notes on this… You can see above that the Primary Owner relationship is not included in the list even though it is a max cardinality = 1 relationship type.  That’s because this view is targeted at a type projection which only includes those two relationship types.  If I wanted to include the Primary Owner column in a view I would need to target the view at a type projection which includes that relationship type.  For example if I target the view at the ‘Incident (advanced)’ type projection (Warning!!) then I will see lots of other relationship types available to me (but they are still all only max cardinality = 1 relationship types!):

image
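For reference, a type projection is declared in a management pack roughly like this (a hand-written sketch: the projection ID and aliases are invented, the alias prefixes depend on your management pack references, and you should verify the exact relationship type names against the sealed System.WorkItem.Library management pack):

```xml
<TypeProjections>
  <TypeProjection ID="Example.Incident.Projection"
                  Accessibility="Public"
                  Type="WorkItem!System.WorkItem.Incident">
    <!-- Max cardinality = 1 relationships: usable as view columns -->
    <Component Path="$Target/Path[Relationship='WorkItem!System.WorkItemAssignedToUser']$"
               Alias="AssignedTo" />
    <Component Path="$Target/Path[Relationship='WorkItem!System.WorkItemAffectedUser']$"
               Alias="AffectedUser" />
    <!-- Max cardinality > 1 relationship: usable in view criteria,
         but not as a column -->
    <Component Path="$Target/Path[Relationship='WorkItem!System.WorkItemAboutConfigItem']$"
               Alias="AffectedCIs" />
  </TypeProjection>
</TypeProjections>
```

A view targeted at this projection would offer AssignedTo and AffectedUser as column sources, while AffectedCIs would only appear in the criteria builder.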

Max Cardinality > 1 relationship types can still be used in the criteria part of the view definition though – as long as the relationship type is included in the type projection.  For example:

image

Notice how max cardinality > 1 relationship types like ‘Is Related to Configuration Item’ and ‘About Configuration Item’ are included here so you can use them as criteria for the view.

So – how do you know if a relationship type is max cardinality = 1 or > 1?  The easiest way is just to look at the fields on the form that are used for those relationships.  For example – assigned to user, affected user, primary owner, etc. all use the user picker which only allows for the selection of a single user:

image

So – this is a max cardinality = 1 relationship type.

Other relationship types use a list view control like this:

image

Since you can have more than one item here this is a max cardinality > 1 relationship type.

Since an action log can have multiple entries in it this is also a max cardinality > 1 relationship type:

image

Hope that helps make it more clear why you can’t add some of the columns you might want.

FAQ: Why Don’t Work Item IDs Increment Uniformly?


Let’s say you create an incident – it has an ID of IR210:

image

2 minutes later you create another incident and it has an ID of IR262:

image

You know that there couldn’t possibly have been 51 incidents created in the last 2 minutes because you are the only person working late at night tonight.  Ever seen this happen before?  What is going on??

Here is how the work item ID field works….

  • First of all, it is important to know that the incident ID is an inherited property from the Work Item class.  All of the work item classes, such as change request, problem, and activity (and, in SCSM 2012, release record and service request), inherit this property. 
  • The ID property is the key property for all of the work item classes, meaning it must be unique for each work item.  It is not possible to have two incidents with ID = IR210.  It is not even possible to have a change request and an incident with the same ID.
  • The ID property is an auto-incrementing property.  It is formatted as <configurable prefix>{0}, where the {0} is replaced with an automatically incrementing integer and the <configurable prefix> is configurable for each work item class.  You can even have no prefix value if you want to.

What this means is that the work item ID increments for every work item that is created regardless of what class of work item is being created.  Imagine this scenario…

  1. Incident 210 is created.
  2. Change request is created – it gets work item ID 211.
  3. The change request contains 5 activity work items – those are IDs 212, 213, 214, 215, 216.
  4. Another incident is created – its ID will be 217.

It is also important to know that when a work item form is opened, the work item ID is “allocated” for the creation of that work item, even if the work item is never actually created.  Thus, this scenario is possible:

  1. Incident 218 is created.
  2. A user (User A) opens an incident form to create a new incident.  ID 219 is allocated.
  3. A different user (User B) opens an incident form to create a new incident.  ID 220 is allocated.

If User A cancels out and never actually submits incident 219, and User B does submit incident 220, then you will see the following incidents:

  • Incident 218
  • Incident 220

There will never be an incident 219, or any work item with ID 219.
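The behavior can be simulated with a tiny counter (hypothetical code, not how SCSM implements this internally): a single integer sequence is shared by every work item class, and a number is consumed as soon as a form opens, whether or not the work item is ever saved.

```python
import itertools

class WorkItemIdAllocator:
    """One counter shared by every work item class."""

    def __init__(self, start=218):
        self._counter = itertools.count(start)

    def open_form(self, prefix):
        # The ID is allocated the moment the form opens;
        # cancelling the form never returns it to the pool.
        return f"{prefix}{next(self._counter)}"

allocator = WorkItemIdAllocator(start=218)
ir218 = allocator.open_form("IR")  # incident is saved        -> IR218
ir219 = allocator.open_form("IR")  # User A cancels the form  -> IR219 is never used
ir220 = allocator.open_form("IR")  # User B saves             -> IR220
cr221 = allocator.open_form("CR")  # a change request continues the same sequence
```

This reproduces both effects described above: the gap left by the abandoned form, and IDs that jump across work item classes.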

Why did we do it this way – having a work item ID shared across all work items?

Having a common work item ID makes it easier to find the work item that somebody is talking about because you don’t need to clarify the work item class.  Without it, you could imagine this dialog:

Bob: “Hey, look up number 1234.”

Tom: “Are you talking about an incident or a problem?”

Because the ID is unique across all work item types, Tom never needs to ask that question.  He can just type the number into the search bar and get the work item immediately.

It looks a little strange to see non-sequentially ordered work item IDs at first, but after a while in a production system with tens of thousands of incidents in it, you won’t even notice anymore.

Why do we allocate work item IDs even if they are not used?

This was an interesting problem actually.  These were the requirements from customers:

  1. Automatically increment the number.
  2. Display the number in the incident form before the work item is actually created so that the analyst can refer to the number when talking to the customer.
  3. Keep the sequence of numbers in chronological order.

If we allocated the numbers at the time the incident was saved, we would have a perfect sequence (assuming other work items were not created in the meantime), but it would not be possible to display the ID when the user first opens the form (violating requirement #2).

If we allocate the number at the time the form is shown then we satisfy all three requirements, but we don’t end up with a perfect sequence.  Since we are not going to end up with a perfect sequence due to the work item ID being shared across all work item types anyway, we felt this was less important.

If we went back and reused numbers which were allocated but never used, we would violate requirement #3.

We discussed this design with quite a few customers and given all the tradeoffs this seemed to meet the most important requirements.

It’s just a little odd if you don’t know the story behind it.  I hope this helps clear it up.
