Friday, November 25, 2011

Managed Metadata's Term Store Programmatically

Tags and Terms
****************
Use tags and notes to share information with colleagues


http://gai3kannan.wordpress.com/2011/03/21/tags-and-notes-in-sharepoint-2010/
http://www.c-sharpcorner.com/UploadFile/anavijai/7664/
SharePoint 2010 released-style site created step by step – create a master page, part two
http://unknownerror.net/2011-06/sharepoint-2010-released-style-site-created-step-by-step-create-a-master-page-two-44410
http://how-2-do.blogspot.com/2008/05/aspnet-master-page-tutorial-part-10.html
http://www.wictorwilen.se/Post/Creating-custom-themable-CSS-files-for-SharePoint-2010.aspx

http://www.heathersolomon.com/blog/articles/MOSS07DesignComponents.aspx
http://rburgundy.wordpress.com/2010/03/10/sharepoint-2010-custom-masterpage-with-code-behind-file-%E2%80%93-part-1/

**********************************************
One of the biggest drawbacks of SharePoint 2007 was the management of content types, or rather, the lack of management of content types. Ensuring consistent content types across an enterprise sometimes drove me nuts, as did the process of updating and maintaining them. The same goes for managing metadata.

For example, say I had a document content type called Contract with a field called ContractType; I had several column types to choose from. I could use a lookup value from a list, which would give me flexibility in the contents, but required me to provision a list that is limited by the site collection boundary. If I chose Choice, I would not have to provision a list, but my values in the column would be fixed and difficult to change across many sites.

Either way, adding, changing or removing metadata and their content types across several thousands of site collections was a big pain in the ****.

Well, Microsoft listened to these complaints and introduced the long-awaited, and personally my favorite, part of SharePoint 2010: the managed metadata service and content type syndication. Content type syndication will be covered in the next article. In this article, we will focus on the managed metadata service and how we can use the API to control it.

Managed Metadata Service (MMS)

Well, the word says it all. It is a service to centrally manage metadata (called terms) in a consistent, managed and hierarchical manner. You can have one or multiple services of this type, although depending on the size of your organization, it would be logical to use only one.

The managed metadata service contains a Term Store, which holds all the terms that have meaning within your organization and provides ways to group them together. For example, say I have a couple of departments within my company called:

•Finance
•Sales
•Procurement
•Development
•R&D
Each of these names would be a term in the taxonomy database. Terms can then be grouped together in what we call a Term Set; in our case this could be Departments. Finally, you can group Term Sets into Groups, for example Business Entities. What is kind of cool, though, is that you can nest terms to form a hierarchy: each term can have a parent term, and so on. This could result in something like this:

•Business Entities (Group)
 ◦Departments (Term Set)
  ■Finance
  ■Sales
  ■Procurement (parent term)
   ■Hardware (child term)
   ■Services
   ■Software
  ■Development
  ■R&D
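The same hierarchy can also be built in code. Below is a minimal sketch using the Taxonomy API, assuming server-side SharePoint 2010 code with references to Microsoft.SharePoint and Microsoft.SharePoint.Taxonomy; the site URL is a placeholder and 1033 is the English (US) LCID:

```csharp
// Sketch: create the Business Entities > Departments hierarchy in the Term Store.
using (SPSite site = new SPSite("http://sharepoint"))  // placeholder URL
{
    TaxonomySession session = new TaxonomySession(site);
    TermStore store = session.DefaultSiteCollectionTermStore;

    Group group = store.CreateGroup("Business Entities");
    TermSet departments = group.CreateTermSet("Departments");

    departments.CreateTerm("Finance", 1033);
    departments.CreateTerm("Sales", 1033);

    // Nested terms: Procurement is the parent of three child terms
    Term procurement = departments.CreateTerm("Procurement", 1033);
    procurement.CreateTerm("Hardware", 1033);
    procurement.CreateTerm("Services", 1033);
    procurement.CreateTerm("Software", 1033);

    departments.CreateTerm("Development", 1033);
    departments.CreateTerm("R&D", 1033);

    // Nothing is persisted until the changes are committed
    store.CommitAll();
}
```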
Each group can have a different set of people assigned as managers or contributors, so you have options to delegate the management of that group throughout your company. For each term set, you can designate stakeholders that should be informed when the content of a set changes.

It is also possible to import terms based on comma separated value (csv) files. This provides the ability to prepare term sets upfront and load them in bulk into the service.

Enterprise Keywords


Some more good news: all the terms your end users enter on SharePoint 2010 list items (the column values) are also gathered in the Enterprise Keywords store. This is an unmanaged store that collects anything users tag on their sites. From this store, you can easily promote terms to managed terms in the hierarchy, giving you valuable feedback about frequently used words and phrases (terms).
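A quick sketch of how you might inspect that store from code, assuming server-side SharePoint 2010 code running in a site context; the KeywordsTermSet property exposes the unmanaged keywords, and promotion into the managed hierarchy can then be done from Term Store Management:

```csharp
// Sketch: list the unmanaged Enterprise Keywords so frequently used ones
// can be reviewed and promoted into the managed hierarchy.
TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
TermStore store = session.DefaultSiteCollectionTermStore;

// The system term set that backs the Enterprise Keywords column
TermSet keywords = store.KeywordsTermSet;
foreach (Term keyword in keywords.Terms)
{
    Console.WriteLine(keyword.Name);
}
```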

Managed Metadata column

So, we now have this cool store with our hierarchy built up, but how do we actually use it? Well, SharePoint 2010 introduces a new column type, called the Managed Metadata column. In the column definition, we can specify the term set to be used for the column, so that the selectable values are gathered from the managed metadata service. In a way, it is much like the Lookup field column, only instead of a list, you now target the term set as the source.
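Creating such a column from code could look roughly like this; a sketch that assumes a term set named Departments already exists, with the site URL and field name as placeholders:

```csharp
// Sketch: create a Managed Metadata site column bound to the Departments term set.
using (SPSite site = new SPSite("http://sharepoint"))  // placeholder URL
{
    TaxonomySession session = new TaxonomySession(site);
    TermStore store = session.DefaultSiteCollectionTermStore;
    TermSet departments = store.GetTermSets("Departments", 1033)[0];

    TaxonomyField field = site.RootWeb.Fields.CreateNewField(
        "TaxonomyFieldType", "Department") as TaxonomyField;
    field.SspId = store.Id;            // which term store to use
    field.TermSetId = departments.Id;  // the term set acting as the "lookup" source
    field.AllowMultipleValues = false;
    site.RootWeb.Fields.Add(field);
}
```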

Taxonomy API


Like so many features within SharePoint, it is cool that we can do this from the interface, but even better if we can do the same from the object model. The object model exposes a complete set of APIs for management of the taxonomy, divided over four namespaces:

•Microsoft.SharePoint.Taxonomy
•Microsoft.SharePoint.Taxonomy.ContentTypeSync
•Microsoft.SharePoint.Taxonomy.Generic
•Microsoft.SharePoint.Taxonomy.WebServices
The first namespace is the most important one: here we find all the classes needed to manage the managed metadata model. The second namespace contains classes that allow you to manage content type syndication (sharing) between site collections; we will get to that in the next article. The Generic namespace contains the generic collections for individual classes in the top-level namespace. Finally, the WebServices namespace provides the classes to manage the model through the provided web services.

Although I will not explore the entire object model in this article, I do want to highlight some simple examples of how to manipulate the Term Store, especially because this is what we will use frequently with multiple projects running on the same farm. We do not want to manage the term store manually, but use the object model to automate it. In the next example, I will load a set of terms (which will become a term set in the Term Store) from a csv file and add it to a group.

Creating the Terms import file


Now, the content of this csv file obviously follows a specific format. The good news here is that you can also specify a hierarchy in this file. The bad news is that it will only do so for 7 levels, including the top level. For most scenarios, however, this will do. As far as I know there are no tools available for constructing this file, which is something I would have expected in a first release; external vendors will no doubt jump on this. The structure of the file should look like below:

"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"
"Departments","This term set contains the departments in our company",,TRUE,,,,,,,,
,,1033,TRUE,,"Finance",,,,,,
,,1033,TRUE,,"Sales",,,,,,
,,1033,TRUE,,"Procurement",,,,,,
,,1033,TRUE,,"Procurement","Hardware",,,,,
,,1033,TRUE,,"Procurement","Software",,,,,
,,1033,TRUE,,"Procurement","Services",,,,,
,,1033,TRUE,,"Development",,,,,,
,,1033,TRUE,,"R&D",,,,,,

Each line has 12 fields (11 commas) and not all fields need a value. The first line has the column headers and will always be the same. The second line contains the description of the term set. Any lines following that will contain the individual terms. I added the file as Terms.csv to my empty SharePoint 2010 project, using the Empty Element item type.



Pay special attention to the Terms.csv properties: set its target location so that the file is deployed to the root of your feature definition. This makes it easy to reference in our feature receiver.

Create the feature and feature receiver

Next, I have created a feature and an associated feature event handler. This is to control how and when the terms should be added to the store. In the feature receiver, we are going to implement the FeatureActivated and FeatureDeactivating methods. First open the designer of the newly added feature and click Manifest at the bottom. Add the following XML to the feature definition:
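A sketch of the relevant part of the feature manifest, reconstructed from the TermsFile key that the receiver below reads (Terms.csv being the file added earlier):

```xml
<Feature xmlns="http://schemas.microsoft.com/sharepoint/">
  <Properties>
    <Property Key="TermsFile" Value="Terms.csv" />
  </Properties>
</Feature>
```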





This provides a property on our feature that we can read from code.

Also, I set the ActivateOnDefault property to False, as well as AutoActivateInCentralAdmin, because I want to control when the feature is activated. Project experience also shows that the ActivateOnDefault property in particular can cause serious problems when your scope is set to WebApplication: the feature will then be activated on each new web application, even if you do not want it to be.

Now we can start coding. Implement the FeatureActivated method as follows:

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    // Requires: System, System.Diagnostics, System.IO,
    // Microsoft.SharePoint and Microsoft.SharePoint.Taxonomy.
    // This feature loads the terms from a csv file and adds them to the Term Store.
    // First, initiate the session
    TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
    // Obtain the term store
    TermStore store = session.DefaultSiteCollectionTermStore;
    // Create the group that will hold my term sets
    Group group = store.CreateGroup("Boom-Business");
    // Get the import manager
    ImportManager manager = store.GetImportManager();
    // Output variables
    bool allAdded = false;
    string errorMessage;
    // Construct the path to the source csv
    string rootFeaturePath = properties.Feature.Definition.RootDirectory;
    string file = properties.Feature.Properties["TermsFile"].Value;
    string fileName = Path.Combine(rootFeaturePath, file);

    try
    {
        using (StreamReader reader = new StreamReader(fileName))
        {
            // Use the manager to import the terms
            manager.ImportTermSet(group, reader, out allAdded, out errorMessage);
            if (!allAdded)
            {
                // Report failure and roll back changes
                EventLog.WriteEntry("Boom.Taxonomy", errorMessage, EventLogEntryType.Error, 101);
                store.RollbackAll();
            }
            else
            {
                // Commit changes
                store.CommitAll();
            }
        }
    }
    catch (Exception ex)
    {
        EventLog.WriteEntry("Boom.Taxonomy", ex.ToString(), EventLogEntryType.Error, 101);
        store.RollbackAll();
    }
}


Let us look at this more closely. First, we initiate a TaxonomySession by providing our current site as the context. We then open the TermStore that is associated to our site collection. When we have multiple services, we should use the TermStores property to get the correct store.
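A sketch of that multi-service scenario; the service application name here is an assumption, so replace it with the name of your own managed metadata service:

```csharp
// Sketch: select a specific term store by service application name
// when more than one managed metadata service is connected.
TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
TermStore store = session.TermStores["Managed Metadata Service"];
```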

We now have the option to either add the term set to an existing group, or create a new one. In this case, I have created a new group, called “Boom-Business”. Next step is to get a reference to an ImportManager that will control the import process for us.

Next, construct the path to the source csv file using the feature properties and its location on disk. Once we have the path, we call the ImportTermSet method of the ImportManager, passing the group to add the terms to and a TextReader (or descendant) that reads the file. The allAdded output variable should be set to true upon return, indicating that all terms were imported successfully. Finally, we commit our changes to the TermStore by calling CommitAll.

That is all there is to it. Obviously, we should also include code to remove the terms again if we deactivate the feature, which is coded in the FeatureDeactivating method.

public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    // First, initiate the session
    TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
    // Obtain the term store
    TermStore store = session.DefaultSiteCollectionTermStore;

    try
    {
        // Get the group from the store
        Group group = store.Groups["Boom-Business"];
        // Get the TermSet
        TermSet set = group.TermSets["Departments"];
        // Delete the set
        set.Delete();
        // Delete the group
        group.Delete();
        // Commit the changes
        store.CommitAll();
    }
    catch (Exception)
    {
        store.RollbackAll();
    }
}

No rocket science here either. Again, set up a TaxonomySession, get the TermStore, get the Group and get the TermSet. We remove the group and term set by calling their Delete methods, and finally commit the changes by calling the term store's CommitAll method. In the catch block I should have logged the exception, but I was a bit lazy toward the end.

Build and deploy your project. I did hit a strange error during deployment from Visual Studio, though: it complained that it could not find my assembly, although it was present in the Global Assembly Cache. However, when I deployed the WSP manually through stsadm (or PowerShell nowadays), everything went well and the project deployed and worked without a problem. Just so you know…

Well, if everything went the way it was supposed to, you can now add the terms by activating the feature in the Farm Features section. Obviously, the same result could be achieved by simply uploading the csv file through the UI, but that is just not the way it will work in large enterprises with different teams.

Multiple Managed Metadata Services

Because the managed metadata service is a service application, you can have multiple instances. Each site collection can consume metadata from multiple instances, making it possible to create a service for each interest group, skill group or operating unit that wants its own metadata.

Considerations

So what can we conclude from above? Well, some things come to mind:

1. Adding, removing and managing the terms, term sets and groups is not much of a problem. Thinking about how you will construct your hierarchy and sound terms, however, is another matter. This involves preparation and a lot of considerations that are more business- than IT-focused.
2. Setting up the management of the Term Store requires processes. You can assign different group managers to each Term Group, but you need to create the governance structure first. Each manager can then access the term store through their own site collection by navigating to Site Settings → Site Administration → Term Store Management. All the items are readable, but only the group(s) they are allowed to manage have write access.
3. Providing a programmatic feature that reads the term files (csv) from a central location with an approval workflow attached sounds promising. Using a timer job and an approval process, selected controllers can add their terms to a document library; a line manager and IT support then validate and approve the file, after which a timer job picks it up.
4. Processes, processes, processes. Doing this in a managed, standardized and governed way is the majority of the work.
In my next article, I will talk about content type syndication, or in plain English, the ability to share content types between multiple site collections using a publisher subscriber model. For those interested, I have uploaded the Visual Studio 2010 solution of the above example here.

How does Tagging Work in SharePoint 2010 for MySites – Microsoft Forum Question
Understanding Managed Metadata in SharePoint 2010 and its Impact in the Content Organizer
Creating Term Set in SharePoint 2010 Programmatically
Using the Managed Metadata Service in your SharePoint 2010 Sites-Part
Managed Metadata Columns
Navigating and Filtering Documents Using Managed Metadata
