Friday, November 25, 2011

SharePoint 2010 – Records Management


Introducing Records Management in SharePoint 2010

managed-metadatas-term-store-programmatically

managed-metadatas-term-store-programmatically2

managed-metadatas-term-store-programmatically 3
managed-metadatas-term-store-programmatically 4

Tags and Terms
****************
Use tags and notes to share information with colleagues


http://gai3kannan.wordpress.com/2011/03/21/tags-and-notes-in-sharepoint-2010/
http://www.c-sharpcorner.com/UploadFile/anavijai/7664/
SharePoint 2010 released-style site created step by step – create a master page (part two)
http://unknownerror.net/2011-06/sharepoint-2010-released-style-site-created-step-by-step-create-a-master-page-two-44410
http://how-2-do.blogspot.com/2008/05/aspnet-master-page-tutorial-part-10.html
http://www.wictorwilen.se/Post/Creating-custom-themable-CSS-files-for-SharePoint-2010.aspx

http://www.heathersolomon.com/blog/articles/MOSS07DesignComponents.aspx
http://rburgundy.wordpress.com/2010/03/10/sharepoint-2010-custom-masterpage-with-code-behind-file-%E2%80%93-part-1/

**********************************************
One of the biggest drawbacks of SharePoint 2007 was the management of content types, or rather, the lack of management of content types. Ensuring consistent content types across an enterprise sometimes drove me nuts, as did the process of updating and maintaining them. The same goes for managing metadata.

For example, say I had a document content type called Contract with a field called ContractType. I had several column types to choose from. I could use a lookup into a list, which would give me flexibility in the contents, but required me to provision a list that is limited by the site collection boundary. If I chose Choice, I would not have to provision a list, but the values in the column would be fixed and difficult to change across many sites.

Either way, adding, changing or removing metadata and their content types across several thousand site collections was a big pain in the ****.

Well, Microsoft listened to these complaints and introduced the long-awaited and personally my favorite part of SharePoint 2010: the managed metadata service and content type syndication. Content type syndication will be covered in the next article. In this article, we will focus on the managed metadata service and how we can use the API to control it.

Managed Metadata Service (MMS)

Well, the name says it all. It is a service to centrally manage metadata (called terms) in a consistent, managed and hierarchical manner. You can have one or multiple services of this type, although depending on the size of your organization, it would be logical to use only one.

The managed metadata service contains a Term Store, which holds all the terms that have any meaning within your organization and provides ways to group them together. For example, say I have a couple of departments within my company called:

•Finance
•Sales
•Procurement
•Development
•R&D
Each of these names would be a term in the taxonomy database. Terms can then be grouped together in what we call a Term Set; in our case this could be Departments. Finally, you can group term sets into Groups, for example Business Entities. What is kind of cool, though, is that you can nest terms to form a hierarchy, so each term can have a parent term and so on. This could result in something like this:

•Business Entities (Group)
◦Departments (Term Set)
■Finance
■Sales
■Procurement (Parent Term)
 ■Hardware (Term)
 ■Services
 ■Software
■Development
■R&D
Each group can have a different set of people assigned as managers or contributors, so you have options to delegate the management of that group throughout your company. For each term set, you can designate stakeholders that should be informed when the content of a set changes.
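
To make this concrete, here is a minimal sketch of what building this hierarchy looks like through the object model (the taxonomy API is covered in more detail below); it assumes store is a TermStore reference obtained from a TaxonomySession:

// Build the Business Entities group, the Departments term set and the nested terms
Group businessEntities = store.CreateGroup("Business Entities");
TermSet departments = businessEntities.CreateTermSet("Departments");
departments.CreateTerm("Finance", 1033);
departments.CreateTerm("Sales", 1033);
Term procurement = departments.CreateTerm("Procurement", 1033);
procurement.CreateTerm("Hardware", 1033); // child terms nested under Procurement
procurement.CreateTerm("Services", 1033);
procurement.CreateTerm("Software", 1033);
departments.CreateTerm("Development", 1033);
departments.CreateTerm("R&D", 1033);
store.CommitAll(); // nothing is persisted until the changes are committed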

It is also possible to import terms from comma separated value (csv) files. This provides the ability to prepare term sets upfront and load them into the service in bulk.

Enterprise Keywords


Some more good news: all the terms your end users enter on SharePoint 2010 list items (the column values) are also gathered in the Enterprise Keywords store. This is an unmanaged store that gathers anything users add to their sites. From this store, you can easily promote terms to managed terms in the hierarchy, giving you valuable feedback on frequently used words/phrases (terms).

Managed Metadata column

So, we now have this cool store with our hierarchy built up, but how do we actually use it? Well, SharePoint 2010 introduces a new column type, called the Managed Metadata column. In the column definition, we can specify the term set to be used for the column, so that the values selected are gathered from the managed metadata service. In a way, it is much like the Lookup field column, only instead of a list, you now target the term set as the source.
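
As a minimal sketch, binding such a column to the Departments term set from code could look like this; list is assumed to be an SPList and session a TaxonomySession for the current site:

// Create a Managed Metadata column and point it at the Departments term set
TermStore store = session.DefaultSiteCollectionTermStore;
TermSet departments = store.Groups["Business Entities"].TermSets["Departments"];

TaxonomyField field = list.Fields.CreateNewField("TaxonomyFieldType", "Department") as TaxonomyField;
field.SspId = store.Id;           // the term store to bind to
field.TermSetId = departments.Id; // the term set that feeds the column
field.AllowMultipleValues = false;
list.Fields.Add(field);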

Taxonomy API


Like so many features within SharePoint, it is cool that we can do this from the interface, but even better if we can do the same from the object model. The object model exposes a complete set of APIs for managing the taxonomy, divided over four namespaces:

•Microsoft.SharePoint.Taxonomy
•Microsoft.SharePoint.Taxonomy.ContentTypeSync
•Microsoft.SharePoint.Taxonomy.Generic
•Microsoft.SharePoint.Taxonomy.WebServices
The first namespace is the most important one. Here we find all the classes needed to manage the managed metadata model. The second namespace contains classes that allow you to manage content type syndication (sharing) between site collections; we will get to that in the next article. The Generic namespace contains the generic collections for individual classes in the top-level namespace. Finally, the WebServices namespace provides the classes to manage the model through the provided web services.

Although I will not explore the entire object model in this article, I do want to highlight some simple examples of how to manipulate the Term Store, especially because this is what we will use frequently with multiple projects running on the same farm. We do not want to manage the term store manually, but use the object model to automate that. In the next example, I will load a set of terms (which will become a term set in the Term Store) from a csv file and append it to an existing group.

Creating the Terms import file


Now, the content of this csv file obviously follows a specific format. The good news is that you can also specify a hierarchy in this file. The bad news is that it only supports 7 levels, including the top level. For most scenarios, however, this will do. As far as I know, there are no tools available for constructing this file, which is something you can expect in a first release; external vendors will dive upon this. The structure of the file should look like below:

"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"
"Departments","This term set contains the departments in our company",,TRUE,,,,,,,,
,,1033,TRUE,,"Finance",,,,,,
,,1033,TRUE,,"Sales",,,,,,
,,1033,TRUE,,"Procurement",,,,,,
,,1033,TRUE,,"Procurement","Hardware",,,,,
,,1033,TRUE,,"Procurement","Software",,,,,
,,1033,TRUE,,"Procurement","Services",,,,,
,,1033,TRUE,,"Development",,,,,,
,,1033,TRUE,,"R&D",,,,,,

Each line has 12 fields (11 commas), and not all fields need a value. The first line contains the column headers and will always be the same. The second line contains the description of the term set. The lines following that contain the individual terms. I added the file as Terms.csv to my empty SharePoint 2010 project, using the Empty Element item type.



Pay special attention to the Terms.csv properties. Set the target location so that it will be added to the root of your feature definition. This will make it easier to reference in our feature receiver.



Create the feature and feature receiver

Next, I have created a feature and an associated feature event handler. This is to control how and when the terms should be added to the store. In the feature receiver, we are going to implement the FeatureActivated and FeatureDeactivating methods. First open the designer of the newly added feature and click Manifest at the bottom. Add the following XML to the feature definition, a custom property pointing at our csv file:

<Properties>
  <Property Key="TermsFile" Value="Terms.csv" />
</Properties>

This will provide a property on our feature that we can use in the code.



Also, I set the ActivateOnDefault property to False, as well as AutoActivateInCentralAdmin, because I would like to control when the feature is activated. Experience from projects also shows that the ActivateOnDefault property in particular can cause serious problems when your scope is set to WebApplication: the feature will then be enabled on each new web application, even if you do not want it to be.

Now we can start coding. Implement the FeatureActivated method as follows:

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    // Requires: using System.Diagnostics; using System.IO;
    // using Microsoft.SharePoint; using Microsoft.SharePoint.Taxonomy;

    // This feature will load the terms from a csv file and add them to the Term Store.
    // First, initiate the session
    TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
    // Obtain the term store
    TermStore store = session.DefaultSiteCollectionTermStore;
    // Create the group that will hold my term sets
    Group group = store.CreateGroup("Boom-Business");
    // Get the import manager
    ImportManager manager = store.GetImportManager();
    // Output variables
    bool allAdded = false;
    string errorMessage;
    // Construct the path to the source csv
    string rootFeaturePath = properties.Feature.Definition.RootDirectory;
    string file = properties.Feature.Properties["TermsFile"].Value;
    string fileName = Path.Combine(rootFeaturePath, file);

    try
    {
        using (StreamReader reader = new StreamReader(fileName))
        {
            // Use the manager to import the terms
            manager.ImportTermSet(group, reader, out allAdded, out errorMessage);
            if (!allAdded)
            {
                // Report failure and roll back the changes
                EventLog.WriteEntry("Boom.Taxonomy", errorMessage, EventLogEntryType.Error, 101);
                store.RollbackAll();
            }
            else
            {
                // Commit the changes
                store.CommitAll();
            }
        }
    }
    catch (Exception ex)
    {
        EventLog.WriteEntry("Boom.Taxonomy", ex.ToString(), EventLogEntryType.Error, 101);
        store.RollbackAll();
    }
}


Let us look at this more closely. First, we initiate a TaxonomySession by providing our current site as the context. We then open the TermStore that is associated with our site collection. When we have multiple services, we should use the TermStores property to get the correct store.
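
For example, assuming the service application carries the default name "Managed Metadata Service" (substitute the name used in your farm):

TermStore store = session.TermStores["Managed Metadata Service"];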

We now have the option to either add the term set to an existing group or create a new one. In this case, I have created a new group called "Boom-Business". The next step is to get a reference to an ImportManager that will control the import process for us.

Next, we construct the path to the source csv file using the feature properties and its location on disk. Once we have the path, we call the ImportTermSet method of the ImportManager, passing in the group to add the terms to and a TextReader (or descendant) that reads the file. The allAdded output variable should be set to true upon return, indicating that all terms were imported successfully. Finally, we need to commit our changes to the Term Store by calling CommitAll.

That is all there is to it. Obviously, we should also include code to remove the terms again when the feature is deactivated, which goes in the FeatureDeactivating method.

public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    // First, initiate the session
    TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
    // Obtain the term store
    TermStore store = session.DefaultSiteCollectionTermStore;

    try
    {
        // Get the group from the store
        Group group = store.Groups["Boom-Business"];
        // Get the TermSet
        TermSet set = group.TermSets["Departments"];
        // Delete the set
        set.Delete();
        // Delete the group
        group.Delete();
        // Commit the changes
        store.CommitAll();
    }
    catch (Exception)
    {
        store.RollbackAll();
    }
}

No rocket science here either. Again, we set up a TaxonomySession, get the TermStore, get the Group and get the TermSet. We remove the group and the term set by calling their Delete methods. Finally, we commit the changes by calling the CommitAll method of the term store. In the catch block, I should have logged the exception, but I was somewhat lazy towards the end.

Build and deploy your project. I did hit a strange error during deployment from Visual Studio, though: it could not find my assembly, although it was present in the Global Assembly Cache. However, when I deployed the WSP manually through stsadm (or PowerShell nowadays), all went well and the project deployed and worked without a problem. Just so you know…

Well, if everything went the way it was supposed to, you can now add the terms by activating the feature in the Farm Features section. Obviously, the same result could be achieved by just uploading the csv file through the UI, but that is just not the way it will work in large enterprises with different teams.

Multiple Managed Metadata Services

Because the managed metadata service is a service application, you can have multiple instances. Each site collection can consume metadata from multiple instances, making it possible to create a service for each interest group, skill group or operating unit that wants its own metadata.

Considerations

So what can we conclude from above? Well, some things come to mind:

1.Adding, removing and managing the terms, term sets and groups is not so much of a problem. However, thinking about how you will construct your hierarchy and sound terms is another matter. This involves preparation and a lot of considerations that are more business than IT focused.
2.Setting up the management of the Term Store requires processes. You can assign different group managers to each term group, but you need to create the governance structure first. Each manager can then access the term store through their own site collection by navigating to Site Settings > Site Administration > Term Store Management. All the items are readable, but only the group(s) they are allowed to manage have write access.
3.Providing a programmatic feature that reads the term files (csv) from a central location with an approval workflow attached sounds promising. Using a timer job and approval process, selected controllers can add their terms to a document library; a line manager and IT support can then validate and approve the file, after which a timer job picks it up.
4.Processes, processes, processes. Doing this in a managed, standardized and governed way is the majority of the work.
In my next article, I will talk about content type syndication, or in plain English, the ability to share content types between multiple site collections using a publisher-subscriber model. For those interested, I have uploaded the Visual Studio 2010 solution of the above example here.

How does Tagging Work in SharePoint 2010 for MySites – Microsoft Forum Question


Understanding Managed Metadata in SharePoint 2010 and its Impact in the Content Organizer


Creating Term Set in SharePoint 2010 Programmatically

Using the Managed Metadata Service in your SharePoint 2010 Sites-Part


Managed Metadata Columns


Navigating and Filtering Documents Using Managed Metadata

Add Lookup Column to SharePoint List Programmatically


Create SharePoint list programmatically C#

SP2010 Basics: Client Object Model

Send Mail through code in Sharepoint using SMTP Mail


How to Create List and Add Item to that List using SharePoint 2010 client object model?

Wednesday, November 23, 2011

SharePoint 2010: Exception Handling Scope & SharePoint 2010 Page Rating using Social Data Service (SocialDataService.asmx)

SharePoint 2010 Page Rating using Social Data Service (SocialDataService.asmx)


Social Data Service


FREE SP 2010


*****************************************************************************
Error Handling
In these examples, I've left error handling and boundary checking code out for the sake of brevity. Of course, in real-world code, we'd add these things and create suitable unit tests to validate their functionality. To make it possible for us to filter SharePoint-specific errors in try/catch blocks, all SharePoint exceptions are derived from the SPException class. Earlier we looked at Sysinternals DebugView as a tool to assist in debugging problems in server-side code. Although we could use this as an error logging tool, SharePoint provides a better way to achieve the same result. Using code similar to the following sample, we can write error logging entries to the SharePoint Unified Logging Service (ULS) logs:
try
{
    // some code
}
catch (Exception ex)
{
    // Define a custom diagnostics category, then write the error to the ULS logs
    SPDiagnosticsCategory myCat = new SPDiagnosticsCategory("A new category",
        TraceSeverity.Monitorable,
        EventSeverity.Error);
    SPDiagnosticsService.Local.WriteEvent(1, myCat,
        EventSeverity.Error,
        "My custom message",
        ex.StackTrace);
}
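
Since all SharePoint exceptions derive from SPException, SharePoint-specific failures can also be filtered from everything else in the same try/catch. A minimal sketch, assuming web is an SPWeb reference:

try
{
    web.Lists.Add("Tasks", "My tasks list", SPListTemplateType.Tasks);
}
catch (SPException)
{
    // SharePoint-specific failures (e.g. a duplicate list name) are caught here first
}
catch (Exception)
{
    // anything non-SharePoint falls through to this general handler
}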
********************************************************************************


SharePoint 2010: Exception Handling Scope

Exception Handling Scope2

Bullet proof coding – SharePoint Error checking and Exception handling


How to use ULS in SharePoint 2010 for Custom Code Exception Logging?


SharePoint 2010 Cookbook: Using ULS Logging


Log your errors in SharePoint to Windows Event log or SharePoint Log files


404 Page Not Found error - after migrating from MOSS 2007 to SharePoint 2010
SharePoint Trace Logs (ULS) and Log Viewer

Tuesday, November 08, 2011

Missing Tag Cloud Web Part (and other Social Web Parts) after SharePoint 2010 Upgrade

Adding missing web parts with the PortalLayouts feature after SharePoint 2007 upgrade

1)PowerShell: Enable-SPFeature -Identity PortalLayouts -Url http://sp2010 -Force
2)stsadm: stsadm -o activatefeature -name "PortalLayouts" -url "http://portal.wherever.com/sites/collectionname" -force
*******************************************
Digging into the SharePoint Portal Layouts Features

Some Social Collaboration Web Parts Missing?

How to Create Custom SharePoint 2010 Page Layouts using SharePoint Designer 2010

Missing "I Like It" in SharePoint 2010
Social Note Board Web Part

SharePoint 2010 Activity Feed Explained – Part 1


Why SharePoint 2010 Social Features Suck

How To: Setup SharePoint 2010 Activity Feed using Outlook 2010 Social Connector

Sharepoint-facebook-wall

1)sharepoint-facebook-wall
http://code.google.com/p/sharepoint-facebook-wall/wiki/GettingStarted
2)Facebook webpart

3)SharePoint 2010 Facebook/Twitter Web Part

4)SharePoint 2010 Activity Feed – Enterprise Twitter

5)http://code-journey.com/

6)http://smarttools.codeplex.com/
Tab list install steps

7)Tabs List Web Part

Friday, November 04, 2011

How To Customize Current Navigation (Left Navigation) in SharePoint 2010 To Show Multiple Levels?

SharePoint 2010 - Open PDF files in browser and set file association icon


SHAREPOINT 2010 Training

Configuring and Administering Microsoft SharePoint 2010


SharePoint 2010 Sandbox limitations:

1. No security elevation - RunWithElevatedPrivileges, which runs a block of code in the application pool account (typically the system account) context, is not allowed in sandboxed code. The SPSecurity class cannot be used in the sandbox either.
2. No email support - The SPUtility.SendMail method has been blocked explicitly in the sandbox. However, the .NET mail classes can be used to send mail (see the sketch after this list). Additionally, the sandbox will not let you read the farm's SMTP address, so developers have to specify the SMTP address in the code itself (or find some other workaround).
3. No support for the WebPartPages namespace - The sandbox does not allow use of the Microsoft.SharePoint.WebPartPages namespace.
4. No support for external web services - Internet web service calls are not allowed, to ensure the security of sandboxed solutions. Assemblies that allow partially trusted callers also cannot be accessed within the sandbox.
5. No GAC deployment - Sandboxed solutions are not stored in the file system (physical path) and assemblies cannot be deployed to the Global Assembly Cache (GAC). They are, however, available under C:\ProgramData\Microsoft\SharePoint\UCCache at runtime. Note that ProgramData is a hidden folder.
6. No visual web parts - By default, Visual Studio 2010 will not let you create visual web parts for deployment as a sandboxed solution. With the Visual Studio Power Tools extensions (downloadable from the Microsoft MSDN website), visual web parts can be developed and deployed as sandboxed solutions.
7. If you deploy the same solution to multiple site collections and later want to upgrade it, the upgrade needs to be done one site collection at a time.
8. There are two types of Web Parts in Visual Studio: visual Web Parts and standard Web Parts. A visual Web Part contains an .ascx control that you can use to visually design the look of the Web Part. Unfortunately, you cannot use visual Web Parts in the sandbox because sandboxed solutions can't deploy files to the Web front end, which is where the .ascx files need to be deployed. This means that you must create a standard Web Part.
9. Almost all classes in the Microsoft.SharePoint.WebControls namespace are unavailable, which means you are mainly restricted to ASP.NET controls in sandboxed solutions.
10. Images, script files, and Features in sandboxed solutions are deployed to the content database, not to the file system of the front-end servers.
11. Application pages, user controls (.ascx files), and localization resource (.resx) files cannot be deployed with a sandboxed solution. (There are other ways to localize sandboxed solutions.)
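
As mentioned in point 2, mail can still be sent from the sandbox with the .NET mail classes. A minimal sketch, with placeholder addresses and a hard-coded SMTP host, since the farm's outbound SMTP settings cannot be read from sandboxed code:

// Requires: using System.Net.Mail;
// All addresses below are placeholders; the SMTP host must be supplied by the developer.
MailMessage message = new MailMessage(
    "noreply@contoso.com",
    "user@contoso.com",
    "Hello from the sandbox",
    "Message body");
SmtpClient client = new SmtpClient("mail.contoso.com");
client.Send(message);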



What is SharePoint 2010
http://sharepoint.microsoft.com/en-us/product/capabilities/Pages/default.aspx

Wednesday, November 02, 2011

Creating SharePoint 2010 Event Receivers in Visual Studio 2010


Custom Content Types in SharePoint

List def - content type

Sharepoint 2010 event handler to create subsites


Creating Event receiver in SharePoint 2010


Programmatically creating Sites and Site Collections from a Custom Web Template


SharePoint Solution Package (WSP) Deployment


How To: Create custom content type and list instance in SharePoint 2010 using Visual Studio 2010


Using SharePoint Feature Receiver create Virtual Directory in IIS

SharePoint Object Size Limitations in MOSS 2007

SharePoint Object Limits
Site collections in a web application: 50,000
Sites in a site collection: 250,000
Sub-sites nested under a site: 2,000
Lists on a site: 2,000
Items in a list: 10,000,000
Items in a view: 2,000
Field types in a list: 256
Documents in a library: 2,000,000
Documents in a folder: 2,000
Maximum document file size: 2 GB (50 MB by default)
Documents in an index: 50,000,000
Search scopes: 1,000
Search scopes in a site: 200
User profiles: 5,000,000
Template size: 10,000,000 (default)

**********************************************************************
Identify Worker Process (w3wp.exe) in IIS 6.0 and IIS 7.0 for Debugging in SharePoint
It is very difficult to identify a particular SharePoint web application's worker process in IIS 6.0 and 7.0, because there are so many web applications running on a SharePoint farm.


How to attach the worker process?
In Visual Studio, go to Tools > Attach to Process, or use the shortcut Ctrl+Alt+P.
Identify Worker Process in IIS 6.0

* Start > Run > Cmd
* Go To Windows > System32
* Run cscript iisapp.vbs

You will get the list of running worker process IDs and the application pool names in IIS 6.0.


Identify Worker Process in IIS 7.0
For IIS 7.0, you need to run the IIS command-line tool (appcmd).

* Start > Run > Cmd
* Go To Windows > System32 > Inetsrv
* Run appcmd list wp

This will show you the list of worker processes running on IIS 7.0, in a format similar to IIS 6.0.

Happy Debugging...!!!

SPBasePermissions Enumeration



MOSS 3.0 to SharePoint 2010 Migration

1. Take a content database backup from the MOSS 3.0 DB server.
2. Restore it on the SharePoint 2010 DB server.
3. Add the restored database to an existing SP 2010 web application.
4. Remove the older content database (the one created at web application creation time).
5. Move all related files from the MOSS 3.0 farm to the SP 2010 farm, e.g. customizations (Features, user controls, web parts, custom application pages, timer jobs etc.) and third-party controls.
6. Run the PreUpgradeCheck PowerShell script from the SharePoint 2010 Management Shell (I will be back with a new post on this).
7. After this command runs successfully, browse your site; it should be working with UI version 3.0.
8. For the SharePoint 2010 look & feel, you have to change to UI version 4.0 (Visual Upgrade; see my next post).


Happy Migration !!!
Visual Upgrade in SharePoint 2010
There are several ways to visually upgrade a migrated SharePoint site.
One of them is to run a PowerShell script from the SharePoint 2010 Management Shell.

Upgrade all site collections under a single web application with the following command:
$webapp = Get-SPWebApplication http://UIVersion3WebApps/
foreach ($s in $webapp.sites)
{$s.VisualUpgradeWebs() }


Upgrade a site collection, including all sub-webs, with the following command:
$site = Get-SPSite http://UIVersion3SiteCollection/
foreach ($web in $site.AllWebs){$web.UIVersion = 4;$web.Update();}

Alternatively, upgrade a site collection's webs using the built-in VisualUpgradeWebs method:
$site = Get-SPSite http://UIVersion3SiteCollection/
$site.VisualUpgradeWebs()

Upgrade only a single web with the following command:
$web = Get-SPWeb http://UIVersion3SiteCollection/site
$web.UIVersion = 4;
$web.Update();


Happy Upgrading !!!

Access FBA SharePoint site using web service
You can access any site collection programmatically with the code below.
You have to use the Authentication.asmx web service to authenticate the user.
Then you can access any of the site's lists.

// Object for the /_vti_bin/Authentication.asmx web service
Authentication.Authentication objAuthentication = new Authentication.Authentication();
objAuthentication.CookieContainer = new System.Net.CookieContainer();
LoginResult loginResult = objAuthentication.Login("username", "password");

if (loginResult.ErrorCode == LoginErrorCode.NoError)
{
    // Grab the FBA authentication cookie returned by the Login call
    CookieCollection objCookieCollection = objAuthentication.CookieContainer.GetCookies(new Uri(objAuthentication.Url));
    Cookie authCookie = objCookieCollection[loginResult.CookieName];

    // Object for the /_vti_bin/Lists.asmx web service
    Lists.Lists objLists = new Lists.Lists();
    objLists.CookieContainer = new CookieContainer();
    objLists.CookieContainer.Add(authCookie);

    System.Xml.XmlNode objListXMLNode = objLists.GetList("Shared Documents");
}


Happy Coding !!!

Understanding webtemp*.xml


ASP.NET Caching

WCF vs. ASMX
Protocols Support

* WCF
o HTTP
o TCP
o Named pipes
o MSMQ
o Custom
o UDP
* ASMX
o HTTP only

Hosting

* ASMX
o Can be hosted only with HttpRuntime on IIS.
* WCF
o A WCF component can be hosted in any kind of environment in .NET 3.0, such as a console application, Windows application, or IIS.
o WCF services are known as 'services' as opposed to web services because you can host services without a web server.
o Self-hosting the services gives you the flexibility to use transports other than HTTP (see the sketch below).
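
To illustrate self-hosting, here is a minimal sketch (the service contract, binding and address are arbitrary examples):

// Requires: using System; using System.ServiceModel;
[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    string Greet(string name);
}

public class GreetingService : IGreetingService
{
    public string Greet(string name) { return "Hello, " + name; }
}

class Program
{
    static void Main()
    {
        // Host the WCF service inside a plain console application -- no IIS involved
        using (ServiceHost host = new ServiceHost(typeof(GreetingService),
            new Uri("http://localhost:8080/greeting")))
        {
            host.AddServiceEndpoint(typeof(IGreetingService), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}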

WCF Backwards Compatibility

* The purpose of WCF is to provide a unified programming model for distributed applications.
* Backwards compatibility
o WCF takes all the capabilities of the existing technology stacks while not relying upon any of them.
o Applications built with these earlier technologies will continue to work unchanged on systems with WCF installed.
o Existing applications can be upgraded to WCF
o New transacted WCF applications will work with existing transactional applications built on System.Transactions

WCF & ASMX Integration

* WCF can use WS-* or HTTP bindings to communicate with ASMX pages

Limitations of ASMX:

* An ASMX page doesn't let you specify how it is delivered over different transports or how a specific type of security is applied. This is something that WCF improves quite significantly.
* ASMX is tightly coupled to the HTTP runtime and depends on IIS for hosting. WCF can be hosted by any Windows process that is able to host the .NET Framework 3.0.
* An ASMX service is instantiated on a per-call basis, while WCF gives you flexibility through various instancing options such as singleton, private session and per-call (see the sketch after this list).
* ASMX provides interoperability, but it does not provide or guarantee end-to-end security or reliable communication.
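
The instancing options mentioned above are a one-attribute choice on the service class. A minimal sketch, reusing the hypothetical GreetingService from the hosting example:

// A single instance serves all callers; InstanceContextMode.PerCall and
// InstanceContextMode.PerSession are the other options.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class GreetingService : IGreetingService
{
    public string Greet(string name) { return "Hello, " + name; }
}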

References:

Migrating ASP.NET Web Services to WCF




Sharepoint Architecture - IIS 6.0, ASP.NET & ISAPI


Handling Sharepoint List Events and Register

Why LINQ

Shared services [Moss 2007]



Create a Page Layout from scratch


Lists and Libraries in the Object Model


MOSS 2007 Tips


Solution Development


Site Columns & Content Types


Web Parts Fundamentals


SharePoint Search

Page_Init vs Page_Load



SharePoint 2007 content types


Gridview hyperlinkfield


Encrypt Web.config


Gridview confirm delete


SharePoint 2007 Libraries & Lists



Web Templates


Client Object Model – Get all lists in a site


Client Object Model : Create a new sub site


Activating a feature using object model


Creating a Folder in List Attachments folder using client object model or Web Service is not allowed


Client Object Model : Delete all sub sites

SharePoint Feature Stapling

Feature Stapling and the FeatureActivated Event in Windows SharePoint Services 3.0
SharePoint 2010 Content Type Hub

Content Type Hub FAQ and Limitations



http://blogs.msdn.com/b/chaks/archive/tags/cthub/


SharePoint 2010 Content Type Publishing

SPList Export for SharePoint 2010 (SPListX) version 4.0 released!!


SharePoint URL Shortener

SharePoint URL Shortener – Feature Highlight

Blog Archive