Monitor an agent - but run response on a Management Server
Comments
- Anonymous
November 08, 2018
Adding back previous comments:

Raphael Burri
November 3, 2015 at 9:08 am

Hi Kevin,
Thank you for writing this up as a reference. I have been using this now and again for the last few years and it is a great enabler for complex workflow scenarios.
However, I came to understand that it is often better to use the more specialized target class Microsoft.SystemCenter.CollectionManagementServer. That way the response will run on a "true" Management Server with database and SDK capabilities, which allows using PowerShell scripts (or modules) that require the SCOM SDK.
Using the Microsoft.SystemCenter.ManagementServer target class will include Gateways. Those do not allow access to the SDK and/or the database, so advanced rule actions may fail when running for agents connected through gateways. When using Microsoft.SystemCenter.CollectionManagementServer instead, the rule action will be executed on the Management Server that is currently serving the Gateway to which the agent is connected. More versatile, in my opinion.
The other remark when using this great rule re-targeting: one has to be careful with variables. The usual $Target$ replacement will show unexpected results when used in the rule action, because it is not the triggering agent's target object properties that are evaluated, but the properties of the Management Server the rule has been redirected to. If you need to know, for example, which agent the rule was triggered on, one possible workaround is to include the agent's name (or other properties) as parameters in the event you're triggering the rule on, then use $Data$ replacement when calling the action script, e.g. $Data/Context/DataItem/Params/Param[3]$ (getting the 3rd parameter from the collected event coming from a consolidated rule).
Raphael
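A minimal PowerShell sketch of that workaround, assuming the consolidated event carries the agent's FQDN as its third parameter and the rule's write action passes $Data/Context/DataItem/Params/Param[3]$ to the script (the script name, parameter name, and event ID below are illustrative, not from the original post):

```powershell
# ProcessAgentEvent.ps1 (hypothetical) - runs as the rule action on the
# Collection Management Server currently serving the agent or its gateway.
# The rule's write action would pass the agent name along, for example:
#   ProcessAgentEvent.ps1 -AgentName "$Data/Context/DataItem/Params/Param[3]$"
param(
    [Parameter(Mandatory = $true)]
    [string]$AgentName   # FQDN of the agent the triggering event came from
)

# Log which agent triggered the workflow and where the action actually ran,
# so the redirection from agent to Management Server is visible in the event log.
$api = New-Object -ComObject "MOM.ScriptAPI"
$api.LogScriptEvent("ProcessAgentEvent.ps1", 9100, 0, "Triggered by agent: $AgentName. Running on: $env:COMPUTERNAME")

# SDK work against $AgentName would go here, e.g. via Import-Module OperationsManager.
```

- Anonymous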
November 08, 2018
The comment has been removed
- Anonymous
November 09, 2018
Hey Kevin, I know this is unrelated to this post, but I couldn't find a better way to ask you. I'm semi-new to SCOM (though a long time with SCCM) and I want to give users access to the console for their machines. I used your AD group MP fragment and created groups with their machines (which was a great help, thank you), but when I set that as their scope and give them access to some of the views (the SQL team doesn't need the AD MP views cluttering up their screen), many of the objects are missing (the SQL team can see their computers, but the DB instances and Availability Groups don't show, for instance). How do I create a group that contains all objects contained by their SQL systems without explicitly listing the 75+ SQL object types in the dynamic query builder?
- Anonymous
November 11, 2018
Hello Kevin. First, thanks for another great article! Second, I have two questions regarding this post:
1. Can I use this technique for disabling a monitor from a SCOM agent that's in an untrusted domain (connects to SCOM via a gateway server)? Currently I'm doing it with a recovery task that uses a remote PowerShell script to disable the required monitors.
2. I can see that lately you have shifted from using the Windows Computer target class to the Windows Server Operating System class instead. Can you please explain why?
- Anonymous
November 11, 2018
1. You should be able to... I don't see why not.
2. I NEVER target "Windows Computer", with very few exceptions. Windows Computer is a special class and there can be many unhosted instances of this class. I almost always target "Microsoft.Windows.Server.OperatingSystem" for generic workflows and discoveries, again with only a few exceptions. I will sometimes target the "Windows Server" class IF I need a Windows Failover Cluster aware workflow, to be able to use the "IsVirtual" property in the discovery. This is incredibly rare. I generally like to target specific application classes for workflows, unless I need to run the workflow on a wide audience. In that case I prefer Windows Server Operating System.
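A quick way to see why those classes are not interchangeable is to compare their instance spaces with the OperationsManager PowerShell module (a sketch; assumes it runs on, or connected to, a Management Server):

```powershell
# Compare how many instances each candidate target class actually has.
Import-Module OperationsManager

$computerClass = Get-SCOMClass -Name "Microsoft.Windows.Computer"                # can include unhosted instances
$osClass       = Get-SCOMClass -Name "Microsoft.Windows.Server.OperatingSystem"  # one per agent-managed server OS

"Windows Computer instances:                {0}" -f (Get-SCOMClassInstance -Class $computerClass).Count
"Windows Server Operating System instances: {0}" -f (Get-SCOMClassInstance -Class $osClass).Count
```

Targeted workflows are loaded for every instance of the target class, including any unhosted Windows Computer objects, which is part of why Kevin avoids it as a default target.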
- Anonymous
November 11, 2018
(The content was deleted per user request)
- Anonymous
November 22, 2018
What kind of permissions does the script need? My script doesn't seem to be able to run the cmdlets I've given it to put a server into MM (the script works against an MS when run directly on the MS), but it does complete the last part of my script where it outputs to a file. I'm wondering if the script needs to have a RunAs account that is an Admin on the MS? Or to the SDK?
- Anonymous
November 22, 2018
This is why ALL scripts should log an event at script start that outputs who the script is running under: https://blogs.technet.microsoft.com/kevinholman/2017/09/10/demo-scom-script-template/
Then you know who the script is running as. By default, when it is running on an MS, the script should be executing as the Management Server Action account, which should have local admin permissions.
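The pattern from the linked template, reduced to its essentials (a sketch; the script name and event ID are placeholders):

```powershell
# Log a "script is starting" event that records which account the script runs under.
$ScriptName = "Set-ServerMaintenanceMode.ps1"   # placeholder name
$EventID    = 9500                              # placeholder event ID
$RunningAs  = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name

$momapi = New-Object -ComObject "MOM.ScriptAPI"
# Severity 0 = Information. The event lands in the Operations Manager event log
# on whichever server actually executed the script.
$momapi.LogScriptEvent($ScriptName, $EventID, 0, "Script is starting. Running as: $RunningAs on $env:COMPUTERNAME")
```

If that event shows the script running as something other than the Management Server Action account, its rights on the MS and the SDK would be the next thing to check.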
- Anonymous
November 27, 2018
Hello, I am looking to do something similar to this: I want to have a monitor on a server that runs a script from the management server, with the result of that script reflected in the health of the monitor. Do you have any suggestions on how to proceed? Thank you.