Creating Custom Connector Sending Claims with SharePoint 2013
What You Will Learn
This blog entry describes how to take an existing custom XML connector for the Search service application, modify the connector to submit security ACLs as claims, install and deploy it for SharePoint 2013, and test it. Using this blog post, you can index external sources together with their security models within SharePoint itself. Keep reading and we will show you how.
Why Use Claims
Claims have the potential to simplify authorization logic for external content in SharePoint 2013. Claims-based identity enables applications to know a few facts about the user's permission to view content. Thus, we can render legacy or different security models into custom claims in SharePoint 2013.
Modifying the XML Connector to send ACL information as claims demonstrates how to "light up" external content in search results, i.e. how to search securely in any content.
Requirements
- SharePoint 2013 Server
- Visual Studio 2012
The Starting Point: XML Connector
This XML Connector is our starting point; we need to modify the connector code, as it does not send security claims with its document submissions.
The next two sub-sections are "largely copied" from Anders Fagerhaug's blog post. A huge thank-you and many kudos go out to him for producing a great starting point for a wonderful custom connector!
Install Custom Connector (Thanks Anders!)
Download the zip archive attached to the bottom of this blog entry.
Unzip the contents of the zip archive to a folder on your computer, e.g. C:\CustomSecurityConnector
On the Start menu, choose All Programs, choose Microsoft Visual Studio 2012, and then choose Visual Studio Tools and open a Visual Studio command prompt.
To install the XmlFileConnector.dll, type the following command at the command prompt:
gacutil /i <ExtractedFolderPath>\XmlFileConnector\bin\Debug\XmlFileConnector.dll
This worked if the output after running gacutil is along the lines of "Assembly successfully added to the cache".
Merge the registry entries for the protocol handler by double-clicking on the registry file located at <ExtractedFolderPath>\xmldoc.reg.
On the Start menu, choose All Programs, then choose Microsoft SharePoint 2013 Products and open a SharePoint 2013 Management Shell as an administrator.
To configure the custom XML connector, at the command prompt, type the following command and run it:
$searchapp = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
New-SPEnterpriseSearchCrawlCustomConnector -SearchApplication $searchapp -Protocol xmldoc -Name xmldoc -ModelFilePath "<ExtractedFolderPath>\XmlFileConnector\Model.xml"
To confirm the configuration, at the command prompt, type the following command and run it:
Get-SPEnterpriseSearchCrawlCustomConnector -SearchApplication $searchapp
The expected output from this should be a protocol "xmldoc" with a model file location pointing to the Model.xml given above.
Finally, we need to restart the search service. Type the following commands at the command prompt:
net stop osearch15
net start osearch15
Create Crawled Property for the Custom XML Connector (Thanks Anders!)
When the custom XML connector crawls content, the crawled properties discovered during crawl will have to be added to a crawled property category. You need to create this category. Note: The user that performs this operation has to be an administrator for the Search service application.
On the Start menu, choose All Programs, then choose Microsoft SharePoint 2013 Products and open a SharePoint 2013 Management Shell as an administrator.
To create a new crawled property category, type the following commands at the command prompt and run them, where <ConnectorName> is the name you want to give the custom XML connector, for example Custom XML Connector:
$searchapp = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
New-SPEnterpriseSearchMetadataCategory -Name "<ConnectorName>" -Propset "BCC9619B-BFBD-4BD6-8E51-466F9241A27A" -SearchApplication $searchapp
The Propset GUID, BCC9619B-BFBD-4BD6-8E51-466F9241A27A, is hardcoded in the file XmlDocumentNamingContainer.cs and should not be changed.
To specify that unknown properties in the newly created crawled property category should be discovered during a crawl, type and run the following at the command prompt:
$c = Get-SPEnterpriseSearchMetadataCategory -SearchApplication $searchapp -Identity "<ConnectorName>"
$c.DiscoverNewProperties = $true
$c.Update()
where <ConnectorName> is the name you gave the custom XML connector, for example Custom XML Connector.
Create Content Source for the XML Contents (Thanks Anders!)
To specify what, when and how the XML content should be crawled, you have to create a new content source for your XML content. Note: The user that performs this operation has to be an administrator for the Search service application.
On the home page of the SharePoint Central Administration website, in the Application Management section, choose Manage service applications.
On the Manage Service Applications page, choose Search service application.
On the Search Service Administration Page, in the Crawling section, choose Content Sources.
On the Manage Content Sources page, choose New Content Source.
On the Add Content Source page, in the Name section, in the Name box, type a name for the new content source, for example XML Connector.
In the Content Source Type section, select Custom repository.
In the Type of Repository section, select xmldoc.
In the Start Address section, in the Type start addresses below (one per line) box, type the address from where the crawler should begin crawling the XML content. The start address syntax differs depending on where the XML content is located.
If the XML content is located on a local drive, use the following syntax:
xmldoc://localhost/<XMLcontentfolder>/#x=doc:id;;urielm=url;;titleelm=title#
If the XML content is located on a network drive, use the following syntax:
xmldoc://<SharedNetworkPath>/#x=doc:id;;urielm=url;;titleelm=title#
For the XML content supplied with this blog entry, use:
xmldoc://localhost/C$/CustomSecurityConnector/#x=Product:ID;;titleelm=Title;;urlelm=Url#
given that you extracted the zip archive of this blog post to C:\CustomSecurityConnector.
Verify that your newly created content source is shown on the Search Service Application page.
The CustomSecurityConnector folder given with the ZIP archive contains Product.xml, which is a sample of a small product catalog.
Create a Crawl Rule for Content Source
It is useful to create a crawl rule if we want to do query-time security trimming for these documents.
Go to SharePoint Central Admin, choose Search Administration.
Under the Crawling section, choose Crawl Rules and then New Crawl Rule.
In the Path field, type in xmldoc://* which should match the CrawlUrl of the documents we will crawl in our example Product.xml file. Even though the Url elements in our data contain URLs along the lines of https://wfe/site/catalog, remember that these are display URLs. The SharePoint gatherer/crawler needs the access URL given in the content source setting.
<!-- -->
<!-- Product -->
<!-- -->
<Product>
<ID>1</ID>
<Url>https://wfe/site/catalog/item1</Url>
...
</Product>
Select "Include all items in this path" and then create the rule by selecting OK.
Crawl
Finally, start a full crawl of the newly created content source. Then select "View Crawl Logs" once the content source status returns to Idle. You should see 11 items successfully crawled.
Modifying the Connector to Send Custom Claims for Security
The Model.xml
First, we need to change the Model.xml of the connector. To enable sending claims, the connector must submit a binary security descriptor, a boolean saying that we will provide our own type of security, and finally an optional string field (docaclmeta).
Essentially, we need to notify the connector framework of our security field type descriptors plus set a few properties to enable this in the model.
Let's start with the TypeDescriptors first. For every item that we wish to enforce custom security on, we have to set the type descriptors for the following fields:
- UsesPluggableAuth as a boolean field type (when true, the item uses custom security claims instead of the Windows security descriptors described next)
- SecurityDescriptor as a byte array for the actual encoded claims data
- docaclmeta as an optional string field, which is only displayed in the search results if populated. This field is not queryable in the index.
In the model file itself, the added lines for TypeDescriptors are encapsulated by the XML comments for Claims Security, part 1/2, like this:
<Parameter Name="Return" Direction="Return">
<TypeDescriptor Name="Return" TypeName=... >
<TypeDescriptors>
<TypeDescriptor Name="Documents" TypeName=...>
<TypeDescriptors>
<TypeDescriptor Name="Item" TypeName=...>
<TypeDescriptors>
...
...
<!-- Claims Security Start, part 1/2 -->
<TypeDescriptor Name="UsesPluggableAuth" TypeName="System.Boolean" />
<TypeDescriptor Name="SecurityDescriptor" TypeName="System.Byte[]" IsCollection="true">
<TypeDescriptors>
<TypeDescriptor Name="Item" TypeName="System.Byte" />
</TypeDescriptors>
</TypeDescriptor>
<TypeDescriptor Name="docaclmeta" TypeName="System.String, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
<!-- Claims Security End , part 1/2 -->
...
...
We are almost done with the model file. The only thing left is to supply the field names for the UsesPluggableAuthentication, WindowsSecurityDescriptorField and DocaclmetaField properties, like this:
<MethodInstances>
<Association Name="GetAllDocuments_Instance" Type="AssociationNavigator" ReturnParameterName="Return" ReturnTypeDescriptorPath="Return.Documents">
<Properties>
...
...
<!-- Claims Security Start, part 2/2 -->
<Property Name="UsesPluggableAuthentication" Type="System.String">UsesPluggableAuth</Property>
<Property Name="WindowsSecurityDescriptorField" Type="System.String">SecurityDescriptor</Property>
<Property Name="DocaclmetaField" Type="System.String">docaclmeta</Property>
<!-- Claims Security End , part 2/2 -->
...
...
The Entities.cs
We need to modify the corresponding C# code for the objects (i.e. documents) that the connector will submit with custom security ACLs. We have to adjust the Document class with the same fields as specified in the Model.xml:
- SecurityDescriptor
- UsesPluggableAuth
- docaclmeta
Thus, we added these three properties in the Entities.cs file as shown next:
public class Document
{
private DateTime lastModifiedTime = DateTime.Now;
public string Title { get; set; }
public string DocumentID { get; set; }
public string Url { get; set; }
public DateTime LastModifiedTime { get { return this.lastModifiedTime; } set { this.lastModifiedTime = value; } }
public DocumentProperty[] DocumentProperties { get; set; }
// Security Begin
public Byte[] SecurityDescriptor { get; set; }
public Boolean UsesPluggableAuth { get; set; }
public string docaclmeta { get; set; }
// Security End
}
The XmlFileLoader.cs
Finally, we need to modify the connector code to read the security input data, translate it into a corresponding byte array of claims, and set the proper field values of the Document class.
Key points to note:
- The UsesPluggableAuth field should only be set to true if we will indeed supply claim ACLs with this document.
- Claims are encoded as a binary byte stream. The claim data type in this example is always string, but the SharePoint backend does not require this.
- The encoding is done according to the protocol documentation, where:
- The first byte signals an allow or deny claim.
- The second byte is always 1 to indicate that this is a non-NT security ACL (i.e. it is a claim ACL type).
- The next four bytes give the size of the following claim value array.
- The claim value string follows as a Unicode byte array.
- The four bytes following the claim value array give the length of the claim type.
- The claim type string follows as a Unicode byte array.
- The four bytes following the claim type array give the length of the claim data type.
- The claim data type string follows as a Unicode byte array.
- The four bytes following the claim data type array give the length of the claim original issuer.
- The claim issuer string finally follows as a Unicode byte array. (A decoding sketch follows this list.)
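To make this layout concrete, here is a minimal decoding sketch that walks such a byte stream and prints each claim. It is our own illustration, not part of the connector download: it assumes, matching the AddClaimAcl method shown later, that the four-byte lengths are little-endian Int32 character counts (as written by BinaryWriter) and that strings are Unicode (UTF-16) encoded. The names ClaimAclDebug and DumpClaimAcl are hypothetical.
using System;
using System.IO;
using System.Text;
internal static class ClaimAclDebug
{
    // Reads one length-prefixed string: a four-byte character count
    // followed by that many two-byte Unicode characters.
    private static string ReadAclString(BinaryReader reader)
    {
        int chars = reader.ReadInt32();
        return Encoding.Unicode.GetString(reader.ReadBytes(chars * 2));
    }
    // Walks a claim ACL byte stream and prints each encoded claim.
    public static void DumpClaimAcl(byte[] acl)
    {
        using (var reader = new BinaryReader(new MemoryStream(acl)))
        {
            while (reader.BaseStream.Position < reader.BaseStream.Length)
            {
                bool isDeny = reader.ReadByte() == 1; // first byte: 0 = allow, 1 = deny
                reader.ReadByte();                    // second byte: always 1 (claim ACL)
                string claimValue = ReadAclString(reader);
                string claimType = ReadAclString(reader);
                string dataType = ReadAclString(reader);
                string issuer = ReadAclString(reader);
                Console.WriteLine("{0}: {1}={2} ({3}, issued by {4})",
                    isDeny ? "deny" : "allow", claimType, claimValue, dataType, issuer);
            }
        }
    }
}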
The Input XML
We modified the input XML from the original connector, adding the claimtype, claimvalue, claimissuer and claimaclmeta fields to the input:
<!-- -->
<!-- Product -->
<!-- -->
<Product>
<ID>1</ID>
<Url>https://wfe/site/catalog/item1</Url>
<Title>Adventure Works Laptop15.4W M1548 White</Title>
<Item_x0020_Number>1010101</Item_x0020_Number>
<Group_x0020_Number>10101</Group_x0020_Number>
<ItemCategoryNumber>101</ItemCategoryNumber>
<ItemCategoryText>Laptops</ItemCategoryText>
<About>Laptop with ... </About>
<UnitPrice>$758,00</UnitPrice>
<Brand>Adventure Works</Brand>
<Color>White</Color>
<Weight>3.2</Weight>
<ScreenSize>15.4</ScreenSize>
<Memory>1000</Memory>
<HardDrive>160</HardDrive>
<Campaign>0</Campaign>
<OnSale>1</OnSale>
<Discount>-0.2</Discount>
<Language_x0020_Tag>en-US</Language_x0020_Tag>
<!-- Security Begin -->
<claimtype>https://surface.microsoft.com/security/acl</claimtype>
<claimvalue>user1</claimvalue>
<claimissuer>customtrimmer</claimissuer>
<claimaclmeta>access</claimaclmeta>
<!-- Security End -->
</Product>
With this sample input in XML, we have to modify the connector to pick up these extra XML tags.
We will create a GetXml helper method for this, where:
- the first parameter is the tag name in the XML (e.g. "claimtype"),
- the second parameter is the default return value for the GetXml method if the tag does not exist, and
- the third parameter is the XML element.
var claimType = GetXml(documentAclClaimTypeElmName, "https://demo.sharepoint.com/acl", elm);
var claimValue = GetXml(documentAclClaimValueElmName, "user1", elm);
var claimIssuer = GetXml(documentAclClaimIssuerElmName, "windows", elm);
var docAclMeta = GetXml(documentAclMetaElmName, null, elm);
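The connector's actual GetXml implementation is not shown in this post. A minimal sketch along those lines, assuming the security tags are direct children of an XElement named elm, could look like this:
using System.Xml.Linq;
// Hypothetical sketch: returns the inner text of the first child element
// with the given tag name, or the supplied default when the tag is absent.
private static string GetXml(string tagName, string defaultValue, XElement elm)
{
    var child = elm.Element(tagName);
    return child != null ? child.Value : defaultValue;
}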
With these variables properly filled out from the XML data, we need a method to transform these variables into a claims byte array. We will call this method GetSecurityAcl:
private static byte[] GetSecurityAcl(
string claimtype,
string claimvalue,
string claimissuer)
{
Byte[] spAcl = null;
if (!string.IsNullOrEmpty(claimtype) &&
!string.IsNullOrEmpty(claimvalue) &&
!string.IsNullOrEmpty(claimissuer))
{
using (var aclStream = new MemoryStream())
{
var dest = new BinaryWriter(aclStream);
AddClaimAcl(dest, false, claimtype, claimvalue, claimissuer);
dest.Flush();
spAcl = aclStream.ToArray();
}
}
return spAcl;
}
We need a method AddClaimAcl to add an encoded claim to a given byte stream:
private static void AddClaimAcl(
BinaryWriter dest,
bool isDeny,
string claimtype,
string claimvalue,
string claimissuer)
{
const string datatype = @"http://www.w3.org/2001/XMLSchema#string";
if (string.IsNullOrEmpty(claimvalue))
{
return;
}
dest.Write(isDeny ? (byte)1 : (byte)0); // Allow = 0, Deny = 1
dest.Write((byte)1); // Indicate that this is a non-NT claim type
//
// Claim Value
//
dest.Write((Int32)claimvalue.Length);
dest.Write(Encoding.Unicode.GetBytes(claimvalue));
//
// Claim Type
//
dest.Write((Int32)claimtype.Length);
dest.Write(Encoding.Unicode.GetBytes(claimtype));
//
// Claim Data Value Type
//
dest.Write((Int32)datatype.Length);
dest.Write(Encoding.Unicode.GetBytes(datatype));
//
// Claim Original Issuer
//
dest.Write((Int32)claimissuer.Length);
dest.Write(Encoding.Unicode.GetBytes(claimissuer));
}
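Note that a document is not limited to a single claim: the backend reads encoded claims until the end of the stream, so multiple claims can be granted by concatenating them on the same writer (the author confirms this in the comments below). A hedged sketch, where "group-sales" is a hypothetical second claim value:
// Append two encoded allow claims to the same stream; the index will
// unlock the document for either principal.
AddClaimAcl(dest, false, claimtype, "user1", claimissuer);
AddClaimAcl(dest, false, claimtype, "group-sales", claimissuer);
dest.Flush();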
Now we have all our variables filled out. Next, we will set the properties of the Document class using these variables:
var security = GetSecurityAcl(claimType, claimValue, claimIssuer);
var doc = new Document
{
DocumentID = id,
LastModifiedTime = fileModifiedTime,
UsesPluggableAuth = security != null,
SecurityDescriptor = security,
docaclmeta = docAclMeta
};
docs.Add(doc);
That is really it. The connector will read the XML security information, encode the claims and supply the optional docaclmeta field before sending this to the SharePoint 2013 indexing backend.
The next blog post will outline how to write a custom pre-security trimmer to unlock these documents for the users who are entitled to view them.
Acknowledgements
Author: Sveinar Rasmussen (sveinar)
A huge thank-you goes out to Anders Fagerhaug (andersfa) and Armen Kirakosyan (armenk) for the original Custom XML Connector blog entry.
Comments
Anonymous
October 29, 2012
Wuaooo... fantastic article!!!
Anonymous
March 12, 2013
Is there any documentation for the claims encoding? Would like to know if it is possible to encode several claims, or only one.
Anonymous
March 12, 2013
Christoffer, the claims encoding is just a stream of data. It works until it hits the end of the stream, and multiple claims can be given here: just concatenate the bytes against each other and "ship it off". The documentation for this format was done as part of the SP2010 protocol documentation effort. I tried to search up the Word file I used when blogging this but came up empty. Hopefully, you should just be able to reuse the existing code and put a "while loop" around the steps at "The encoding is done according to the protocol documentation where...".
Anonymous
April 21, 2013
Great article!! Sveinar, if we let external users (Windows, non-Windows, SQL) log in to the SharePoint search site via Forms Based Authentication, do we still have to implement the pre-trimmer? I thought if external users log in via FBA, then SharePoint itself performs the trimming, based on the "SecurityDescriptor" we provide at crawl time.
Anonymous
April 22, 2013
With a web application using forms-based authentication, we will still see claims in a pre-trimmer. Remember, SharePoint uses claims for everything internally regardless of the FBA settings. As such, I would expect to find a similar FBA-sourced claim identity from the STS, and the corresponding sAMaccount claim from the external SQL set of users will work with the search service trimming without the search core knowing that the user comes from any forms-based authenticated approach. To the search engine, everything looks like a claim :-) More details on the FBA setup: technet.microsoft.com/.../ee806890.aspx
Anonymous
April 22, 2013
Hi Sveinar, My understanding was, if we can provide the "WindowsSecurityDescriptorField" to the crawler via BDC, and we have configured an authentication provider with SharePoint, then we don't need to implement query time security trimming (a pre-trimmer). Is that the idea? Or is there a different way of deciding whether to use crawl time security trimming or query time security trimming?
Anonymous
April 22, 2013
The comment has been removed
Anonymous
April 23, 2013
The comment has been removed
Anonymous
April 25, 2013
The comment has been removed
Anonymous
April 29, 2013
Hi Sveinar, Thanks for the clarification.
Anonymous
May 15, 2013
Sveinar, thank you for this excellent article. We are currently using the XMLConnector and the Pre-Trimmer to secure some content! I have some follow-up questions that pertain to securing custom content in SP2013 search. We have a 2nd content source where we need to provide both allow and deny ACLs to the documents, and the ACLs come in with the content like this: S-1-5-11;S-1-5-21-3634641040-146604216-3188367109-1010 I have also read Murad's blog that complements this: blogs.msdn.com/.../no-code-secure-search-with-fast-search-for-sharepoint-2010.aspx According to how I read your discussion with Sandun, all we need to do is package the ACLs and assign them to the correct fields, and then there is no need for a pre nor post trimmer, as the STS will pick up the user and filter the documents accordingly. If this is correct, what are the details to setting these ACLs up (if not correct, please clarify):
1. What is the final format of the ACLs: Base32-encoded SIDs separated by a semicolon, or a byte array with all the SIDs added?
2. What field should either the SIDs or the byte array be mapped to by the XML connector? WindowsSecurityDescriptorField or SecurityDescriptor?
3. UsesPluggableAuth should be set to false for these documents, correct?
4. How does your latest comment regarding "NT Authority\Authenticated Users" influence what is mapped to the content fields?
Anonymous
May 20, 2013
Shane, I believe the best approach in your case is not to use custom claims at all :-) You could play directly with ACEs on Windows' own security descriptors, returning a binary form of that for your WindowsSecurityDescriptorField in the BCS framework.
1. I believe I would just play with ACEs. For instance, you could create a CommonSecurityDescriptor using the SID in its constructor. With this constructed security descriptor, you could issue RemoveAccess and AddAccess to deny everyone and then grant full access for specific groups or users by SIDs. When done, fire up a byte array with CommonSecurityDescriptor.BinaryLength bytes and call GetBinaryForm to fill in your array.
2. Set the binary form byte array representation of the security descriptor in the WindowsSecurityDescriptorField of your connector.
3. Correct. False :-)
4. In this case, it should not make a difference. The Authenticated Users group should be represented as a SID internally, i.e. a security descriptor rather than a set of claims. Thus, you should be able to get away with non-claim ACLs using the SIDs alone. Hope this helps, Sveinar.
Anonymous
November 08, 2013
Hi, we have a requirement for custom security groups (non-AD security groups) which are local to the external source; those groups are not a part of AD. Below is what we are looking to implement. Please let me know how this can be achieved.
1) The SP crawler has to index an external source outside of SharePoint that has its own security groups.
2) While crawling a document, we want to set the security group which is local to that external source.
3) When a user logs in with his own credentials, we want to trim the results based on the logged-in user's custom security group entitlements, using pre-security trimming.
Regards, RK.
Anonymous
November 12, 2013
While I haven’t set up a similar case with multiple SharePoint farms where each farm has its own security groups, I do believe that one obstacle here to overcome is the ACLs that will be associated with the external source contents (as seen from the SP web-crawl). The general rule is to use the BCS framework to set the SecurityDescriptor with custom claims, where the custom claims are those other external (and different) security groups. Similarly, when the user logs on to query for content, a pre-trimmer can be used here to perform an analogous mapping from the local user to its corresponding external security group memberships. Thus, this should be possible, but it requires a custom BCS connector to supply the proper claims/ACLs on the content and a pre-trimmer to perform a successful secure lookup. With the SP Crawler, I do believe that it is not possible to customize the ACLs that it puts on the content crawled. For additional information, be sure to check out the blog post on “Creating a Custom Pre-Security Trimmer for SharePoint 2013”.
Anonymous
November 20, 2013
Hi Sveinar, Very nice article. I've a doubt in the response which you gave to Shane in the first point: "create a CommonSecurityDescriptor using the SID in its constructor". What should be the SID here? I tried something like below:
SecurityIdentifier everyone = new SecurityIdentifier(WellKnownSidType.WorldSid, null);
CommonSecurityDescriptor csd = new CommonSecurityDescriptor(false, false, ControlFlags.None, everyone, null, null, null);
csd.SetDiscretionaryAclProtection(true, false);
//To remove access to everyone
csd.DiscretionaryAcl.RemoveAccess(AccessControlType.Allow, everyone, unchecked((int)0xffffffffL), InheritanceFlags.None, PropagationFlags.None);
//To provide access to users separated by semicolon by looping
for (int i = 0; i < users.Length; i++)
{
    User = users[i];
    NTAccount ntAcc = new NTAccount(User.Split('\\')[0], User.Split('\\')[1]);
    SecurityIdentifier Sid = (SecurityIdentifier)ntAcc.Translate(typeof(SecurityIdentifier));
    csd.DiscretionaryAcl.AddAccess(AccessControlType.Allow, Sid, unchecked((int)0xffffffffL), InheritanceFlags.None, PropagationFlags.None);
}
sec = new byte[csd.BinaryLength];
csd.GetBinaryForm(sec, 0);
return sec;
With this code I'm not getting any exception. But when I crawl I'm getting the below error in the crawl log:
Error while crawling LOB contents. ( Error caused by exception: Microsoft.BusinessData.Runtime.RuntimeException MethodInstance with Name 'ReadSecurityDescriptorInstance' on Entity (External Content Type) with Name 'Entity1' in Namespace 'Sample_CST.BdcModel1_CST' failed unexpectedly. The failure occurred in method 'ReadSecurityDescriptor' defined in class 'Sample_CST.BdcModel1_CST.Entity1Service' with the message 'Some or all identity references could not be translated.'. )
Any views on this please. Thanks, Manjeera
Anonymous
November 24, 2013
The comment has been removed
Anonymous
November 26, 2013
Hi Sveinar, In the security descriptor column, if we have Outlook distribution lists as values, will the SecurityIdentifier take care of that? Thanks, Manjeera
Anonymous
March 11, 2016
Hi Sveinar, I am a few years late to the party, but better late than never I suppose. I am facing a similar problem to the one you have solved in this blog and I am hoping you might be able to point me in the right direction. I have a client who has built a custom role provider based on SQL Server, i.e. all of the users and defined roles/groups are in SQL. This custom role provider has been associated with SharePoint and users can log in to SP using FBA; permissions are granted to the groups from the DB. In fact they are in the form of GUIDs. The SP 2013 search crawl can crawl the sites with these permissions perfectly OOTB. They also have DB content which I am indexing using a BCS connector, and they would like to associate the DB data with the groups. I have configured my connector exactly the same way as you have described in this blog and pass the GUID as the claimvalue and the name of the custom role provider as the claimissuer. The data gets indexed but does not get returned as results. Reading all of the comments on here is making me wonder if I am going about this the correct way. Are you able to shed some light on my issue? Thanks in advance
Anonymous
March 12, 2016
Hello Jobstar40, sounds like you are right on the ball with the custom claims from the DB content. Have you provided the extra claim needed in a pre-security trimmer that is using the exact same custom claim type that you provided in the BCS framework for these documents? I am thinking that you are missing a match-up on the query side claims to "unlock and surface" these documents for the GUIDs you provided :-) What is the custom claim type anyhow? Thanks, Sveinar.
Anonymous
March 13, 2016
Hi Sveinar, Thank you for replying so soon. Do I need to implement a pre-trimmer on the query side? This is the part I don't fully understand. If, for example, some of the SharePoint content has already been secured by the custom role provider roles and we are able to index and search this content already using the OOTB search center, do I still need a pre-trimmer on the query side to be able to search the BCS content? If I understand your question, the role provider is already in place in SharePoint as users can log in with Windows Auth or FBA (which uses the custom role provider). The GUIDs I am storing as a byte array with the DB content are pretty much exactly the same as the ones used in my SharePoint example above. The client has built a complex SQL Server role provider where they store users and groups in the DB, and they have complex rules in place that govern what a logged-in user can/cannot see. Any help is appreciated at this point. Thanks, Job
- Anonymous
March 17, 2016
Hey Job Maelane, If you set up your WindowsSecurityDescriptorField with the byte array of SIDs in BCS, just as Manjeera did above, documents should show up if the same SIDs are available for the users logged into SharePoint. If this is hosted on-prem, that should be straightforward. Perhaps you are in a hybrid environment, which would require AD dir-sync? I am thinking that the BCS framework isn't set up correctly to provide the proper SecurityDescriptor contents here perhaps, since you aren't seeing any content. If you used a custom claim type in your BCS setup like in the GetSecurityAcl(string claimtype, string claimvalue, string claimissuer) above, are you certain that the issuer matches up with the equivalent user claim at runtime? Maybe create a web part in SharePoint to dump all user claims to see what you got? https://visualstudiogallery.msdn.microsoft.com/6a708361-f534-4a2a-861e-b5f6efa9238c
- Anonymous
March 23, 2016
Hi Sveinar, I have worked it out. I think I was getting confused with the "need" for a custom pre-trimmer, and it turns out I needed one. I can now see the correct results for the correct users. Thanks.