Crawl-time item-level security trimming for external systems in SharePoint 2010
Unlike MOSS 2007, where you could only do query-time security trimming on BDC entities (https://msdn.microsoft.com/en-us/library/aa980904(v=office.12).aspx), SharePoint Server 2010 lets you implement security trimming at crawl time itself (https://msdn.microsoft.com/en-us/library/gg294169.aspx). That article, however, does not include a sample for implementing item-level security through a .NET assembly connector, so this post provides one.
When it comes to trimming search results for a BCS external system, there are two approaches: you can trim at CRAWL time or at QUERY time.
QUERY time security trimming:
Trimming search results at query time requires that the permissions of the current user be checked against every item (crawled URL) returned by the search query after the query completes. This is potentially a very expensive process, but it may be necessary because of how the external system works or because SharePoint is using something other than Active Directory for security. If you are granting permissions to individual items in the external system using NTLM (Windows) users, the recommendation is to go for CRAWL time security trimming, since query-time security trimming is an expensive process. For CRAWL time security trimming, all you need is a column in your external system table that carries the permission information for each item (row). If you have a huge external system that is already built and adding and populating a permissions column would be tedious work, you can opt for query-time security trimming instead.
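For reference, query-time trimming in SharePoint 2010 Search is done with a custom security trimmer registered against the crawl rule for the content source. The following is only a minimal sketch, assuming the ISecurityTrimmer2 interface; the IsAllowedInExternalSystem helper, which would call back into the external system to check the querying user's permission for each crawled URL, is a hypothetical placeholder:

using System.Collections;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Security.Principal;
using Microsoft.Office.Server.Search.Administration;
using Microsoft.Office.Server.Search.Query;

public class ExternalSystemSecurityTrimmer : ISecurityTrimmer2
{
    public void Initialize(NameValueCollection staticProperties, SearchServiceApplication searchApplication)
    {
        // Read any configuration supplied when the trimmer was registered,
        // for example a connection string to the external system.
    }

    public BitArray CheckAccess(IList<string> documentCrawlUrls,
                                IDictionary<string, object> sessionProperties,
                                IIdentity passedUserIdentity)
    {
        // One bit per crawled URL: true keeps the result, false trims it from the result set.
        BitArray accessBits = new BitArray(documentCrawlUrls.Count);
        for (int i = 0; i < documentCrawlUrls.Count; i++)
        {
            accessBits[i] = IsAllowedInExternalSystem(documentCrawlUrls[i], passedUserIdentity.Name);
        }
        return accessBits;
    }

    // Hypothetical helper: ask the external system whether the querying user may see this item.
    private bool IsAllowedInExternalSystem(string crawlUrl, string userName)
    {
        return true; // placeholder
    }
}

Because CheckAccess runs against every result at query time, this is exactly the per-item cost described in the paragraph above.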
CRAWL time security trimming:
Trimming search results at CRAWL time is the preferred approach, because you can build an Access Control List (ACL) during the crawl that becomes part of the search index. You can either store the set of security principals for each item in the external system (in byte-array format) and fetch it through the BDC model, or store the user information for each item in the external system (as a Domain\username string) and return it as a byte array through a BDC .NET assembly connector. The security principal information has to be fetched as a byte array because BCS uses the built-in “WindowsSecurityDescriptorField” property, which expects a byte array to be returned.
Implementing CRAWL time security trimming in a BDC model that connects directly to the external data source is quite easy. All you need to do is add the “WindowsSecurityDescriptorField” property to the Specific Finder method instance in your BDC model and map it to the external system column that carries the ACL information. (The ACL information in the back-end external system needs to be stored as a byte-array data type.) Below is a sample Specific Finder method from a BDC model with CRAWL time security trimming. NOTE: if you add any new permissions, you need to run a FULL crawl again, since the ACL is built during the crawl and becomes part of the index.
The SecurityDescriptor column and the “WindowsSecurityDescriptorField” property in the sample below illustrate the methodology explained here:
<Method Name="Item SpecificFinder ">
<Properties>
<Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">Text</Property>
<Property Name="RdbCommandText" Type="System.String">
SELECT [Identifier] ,
[SecurityDescriptor] FROM [Test].[dbo].[Items] WHERE [Identifier] = @Identifier
</Property>
<Property Name="BackEndObjectType" Type="System.String">SqlServerTable</Property>
<Property Name="BackEndObject" Type="System.String">Items</Property>
<Property Name="Schema" Type="System.String">dbo</Property>
</Properties>
<Parameters>
<Parameter Direction="In" Name="@Identifier">
<TypeDescriptor TypeName="System.Int32" IdentifierName="Identifier" Name="Identifier" />
</Parameter>
<Parameter Direction="Return" Name="BaseItemsRead Item">
<TypeDescriptor TypeName="System.Data.IDataReader, System.Data, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" IsCollection="true" Name="BaseItemsRead Item">
<TypeDescriptors>
<TypeDescriptor TypeName="System.Data.IDataRecord, System.Data, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" Name="BaseItemsRead ItemElement">
<TypeDescriptors>
<TypeDescriptor TypeName="System.Int32" IdentifierName="Identifier" Name="Identifier"/>
<TypeDescriptor TypeName="System.Byte[], mscorlib, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" IsCollection="true" Name="SecurityDescriptor">
<TypeDescriptors>
<TypeDescriptor TypeName="System.Byte" Name="SecurityDescriptorElement" />
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Type="SpecificFinder" ReturnParameterName="BaseItemsRead Item"
ReturnTypeDescriptorName="BaseItemsRead ItemElement" Name="BaseItemsRead Item"
DefaultDisplayName="ReadSecurity">
<Properties>
<Property Name="WindowsSecurityDescriptorField" Type="System.String">
SecurityDescriptor
</Property>
</Properties>
</MethodInstance>
</MethodInstances>
</Method>
Steps to implement CRAWL time security trimming in a .NET assembly connector
Implementing a CRAWL time security trimmer in a .NET assembly connector requires defining a “BinarySecurityDescriptorAccessor” method. A “BinarySecurityDescriptorAccessor” method accepts at least an identifier and returns a byte array representing the set of security principals and their associated permissions for the entity instance. If your .NET assembly connects to an external system, you can get the security descriptor value by passing the identifier to the external system, provided the external system stores a security descriptor value for every item. The security descriptor returned from this method becomes part of the search index and is used to trim search results. For example, if your BCS .NET assembly connector connects to a SQL external system, you can read the security descriptor stored in the SQL table for every row, build the ACL in the .NET assembly method, and return it (a sketch of this variant follows the note below).
NOTE: if you add any new permissions, you need to run a FULL crawl again, since the ACL is built during the crawl and becomes part of the index.
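If the security descriptors are stored per row in the external SQL table, as described above, the accessor can simply read the value and return it instead of building it from a policy file. A minimal sketch, assuming the [dbo].[Items] table and SecurityDescriptor column from the earlier model and a hypothetical connection string:

using System.Data.SqlClient;

public static byte[] ReadSecurityDescriptorFromSql(string id, string user)
{
    // Hypothetical connection string to the external SQL system.
    string connectionString = "Data Source=.;Initial Catalog=Test;Integrated Security=True";

    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(
        "SELECT [SecurityDescriptor] FROM [dbo].[Items] WHERE [Identifier] = @id", connection))
    {
        command.Parameters.AddWithValue("@id", id);
        connection.Open();
        // The column already holds the binary security descriptor for the row,
        // so it can be returned to BCS as-is.
        return (byte[])command.ExecuteScalar();
    }
}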
For the sample in this post, consider the following XML file, which defines the security policy associated with your external system. (This sample grants and denies access to a set of users for all items in the external system. If you want to grant and deny access to different users for different items, it is better to store the users in the external system, retrieve the row based on the identifier, read the value of the security descriptor field, and grant access accordingly.)
<?xml version="1.0" encoding="utf-8"?>
<Policy>
<Grant>
<Account Name="domain\syedi" />
</Grant>
<Deny>
<Account Name="domain\ibrahim" />
</Deny>
</Policy>
First, define a “BinarySecurityDescriptorAccessor” method in the .NET assembly connector with, at a minimum, an “In” parameter defined as the identifier and a “Return” parameter defined as a byte array. You also need to define a filter descriptor of type UserContext to pass in the user name of the current user crawling the BCS content source (the “DefaultContentAccess” account); this account is used to build the ACL for each entity instance during the crawl. So the method has two input (“In”) parameters: the identifier and the current logged-on user who is crawling the content. Use the Visual Studio designer to create the “BinarySecurityDescriptorAccessor” method instance with the input and return parameters described above, along with the filter descriptor.
After the “BinarySecurityDescriptorAccessor” method is defined in the Visual Studio designer, Visual Studio automatically stubs out the implementation. In addition to the “BinarySecurityDescriptorAccessor” method, you can also map the security descriptor field using the “WindowsSecurityDescriptorField” property. If the “WindowsSecurityDescriptorField” property is not present, the “BinarySecurityDescriptorAccessor” method is called whenever the security descriptor of the entity instance is needed. Once designed in Visual Studio, the BDC model for the “BinarySecurityDescriptorAccessor” method instance looks like the following:
<Method Name="ReadSecurityDescriptor">
<FilterDescriptors>
<FilterDescriptor Name="UserFilter" Type="UserContext" />
</FilterDescriptors>
<Parameters>
<Parameter Name="Id" Direction="In">
<TypeDescriptor Name="IdTypeDescriptor" TypeName="System.String" IdentifierName="Identifier1" />
</Parameter>
<Parameter Name="User" Direction="In">
<TypeDescriptor Name="CurrentUser" TypeName="System.String" AssociatedFilter="UserFilter" />
</Parameter>
<Parameter Name="Acl" Direction="Return">
<TypeDescriptor Name="MySecurityDescriptor" TypeName="System.Collections.Generic.IEnumerable`1[System.Byte]" IsCollection="true">
<TypeDescriptors>
<TypeDescriptor Name="SecurityDescriptorByte" TypeName="System.Byte" IsCollection="false" />
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Name="ReadSecurityDescriptorInstance" Type="BinarySecurityDescriptorAccessor" ReturnParameterName="Acl" ReturnTypeDescriptorPath="MySecurityDescriptor">
<Property Name="WindowsSecurityDescriptorField" Type="System.String">
MySecurityDescriptor
</Property>
</MethodInstance>
</MethodInstances>
</Method>
Now that the method is defined in the BDC model, you need to implement the code for the BinarySecurityDescriptorAccessor method instance in the .NET assembly. In this case, you need to create an ACL based on the policy provided in the XML file. The following code shows the complete implementation of this method in the .NET assembly connector:
// Requires: using System; using System.Linq; using System.Xml.Linq;
//           using System.Security.Principal; using System.Security.AccessControl;
public static byte[] ReadSecurityDescriptor(string Id, string User)
{
    Uri uri = new Uri("C:/BCSSecurity/policy.xml");
    XDocument policyfile = XDocument.Load(uri.AbsoluteUri);

    // Create a security identifier for the currently logged-on user who is crawling (the DefaultContentAccess account).
    NTAccount oAcc = new NTAccount(User.Split('\\')[0], User.Split('\\')[1]);
    SecurityIdentifier workerSid = (SecurityIdentifier)oAcc.Translate(typeof(SecurityIdentifier));

    // Create a security descriptor to build the ACL, keeping the crawling user as the owner (SecurityIdentifier) of the ACL.
    CommonSecurityDescriptor csd = new CommonSecurityDescriptor(false, false, ControlFlags.None, workerSid, null, null, null);

    // Populate the common security descriptor (ACL) with the set of users to GRANT access, read from the policy file.
    var grant = from account in policyfile.Descendants("Grant").First().Descendants("Account") select account;
    foreach (var acc in grant)
    {
        NTAccount ntAcc = new NTAccount(acc.Attribute("Name").Value.Split('\\')[0], acc.Attribute("Name").Value.Split('\\')[1]);
        SecurityIdentifier Sid = (SecurityIdentifier)ntAcc.Translate(typeof(SecurityIdentifier));
        // Add the user read from the policy file to the common security descriptor with the Allow access control type.
        csd.DiscretionaryAcl.AddAccess(AccessControlType.Allow, Sid, unchecked((int)0xffffffffL), InheritanceFlags.None, PropagationFlags.None);
    }

    // Populate the common security descriptor (ACL) with the set of users to DENY access, read from the policy file.
    var deny = from account in policyfile.Descendants("Deny").First().Descendants("Account") select account;
    foreach (var acc in deny)
    {
        NTAccount ntAcc = new NTAccount(acc.Attribute("Name").Value.Split('\\')[0], acc.Attribute("Name").Value.Split('\\')[1]);
        SecurityIdentifier Sid = (SecurityIdentifier)ntAcc.Translate(typeof(SecurityIdentifier));
        // Add the user read from the policy file to the common security descriptor with the Deny access control type.
        csd.DiscretionaryAcl.AddAccess(AccessControlType.Deny, Sid, unchecked((int)0xffffffffL), InheritanceFlags.None, PropagationFlags.None);
    }

    // Transform the security descriptor into a byte array and return it.
    byte[] secDesc = new byte[csd.BinaryLength];
    csd.GetBinaryForm(secDesc, 0);
    return secDesc;
}
Code flow for creation of the ACL in the above method implementation:
The first thing that happens in the BinarySecurityDescriptorAccessor method implementation is that the XML policy file is loaded and a new “NTAccount” object is created based on the identity of the current user. At crawl time, the identity of the user is the account used to crawl the external system content source (the DefaultContentAccess account). The implementation makes this account the owner of the security descriptor for the entity instance by creating a “SecurityIdentifier” and setting it as the owner of the CommonSecurityDescriptor. The CommonSecurityDescriptor is the object that holds the Access Control List.
The next step is to run a LINQ query against the loaded policy file and return all the accounts that will be granted access. (If you have stored the per-user permission information in the external system as a field in the table, you can retrieve it from the back end instead; in this sample it is retrieved from the policy file.) For each of these accounts, an entry is made in the access control list to grant access. Similarly, entries are made in the access control list to deny access for the designated accounts in the policy file. Finally, the CommonSecurityDescriptor object is transformed into a byte array and returned from the method.
Once the method implementation is complete, along with a “Finder” method instance carrying the RootFinder property or an “IDEnumerator” method instance, the .NET assembly connector can be deployed to SharePoint and used as a content source in Search. The Finder or IDEnumerator method instance takes care of the crawling (in MOSS 2007 an IDEnumerator method instance is mandatory for crawling, but in SharePoint Server 2010 you can have either an IDEnumerator method instance or a Finder method instance with the RootFinder property), and the BinarySecurityDescriptorAccessor method instance takes care of item-level security trimming during the crawl.
You can find the complete Project in the attachment.
Comments
Anonymous
April 09, 2013
Excellent article and very detailed and neat explanation. Thank you syed

Anonymous
November 03, 2013
I've done the same way according to your article. But still ACL is not applied on my db items. Am I doing anything wrong? Please help.

Anonymous
January 20, 2014
Syed, After investigating your solution, we've not been able to confirm it's viability when applied to SharePoint 2013. Do you know if there is an obvious reason (claims?) why the trimming function does not read the security descriptor correctly? It seems that SP2013 errors on the binary SID. Any help is appreciated.

Anonymous
August 25, 2014
I've found that crawl time security checking doesn't completely address the issue. While it does an excellent job of trimming the search query results, it doesn't do anything to prevent users from accessing the content outside of search. For example, person A is authorized to view a record via crawl-time checking and they pass the ECT Profile URL to person B who is not authorized. Person B will be able to view the record just fine depending upon the data source configuration. If the data source is using a group credential from SSO where the data access is secured via that account, the External Data web parts don't invoke the BinarySecurityDescriptorAccessor to check security. You still need to apply the security to the back-end data source. This is all in the context of a SQL Server backend using the SQL Server data source type.
August 25, 2014
I've found that crawl time security checking doesn't completely address the issue. While it does an excellent job of trimming the search query results, it doesn't do anything to prevent users from accessing the content outside of search. For example, person A is authorized to view a record via crawl-time checking and they pass the ECT Profile URL to person B who is not authorized. Person B will be able to view the record just fine depending upon the data source configuration. If the data source is using a group credential from SSO where the data access is secured via that account, the External Data web parts don't invoke the BinarySecurityDescriptorAccessor to check security. You still need to apply the security to the back-end data source. This is all in the context of a SQL Server backend using the SQL Server data source type.