Tuesday, August 5, 2008

Adding Default Content to a SharePoint Wiki

If you create a new wiki through a list instance in a feature, through a site definition, or programmatically, the default How To and Home pages will not appear in that wiki.

To add those pages, the Microsoft.SharePoint.Utilities.SPUtility class offers the following public static method:

AddDefaultWikiContent(SPList wikiList)

Call this method to add the default pages. Alternatively, if you want more control over the content of the provisioned pages, use Reflector to disassemble the following DLL and examine the code in that method to see how it is done:

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\ISAPI\Microsoft.SharePoint.dll
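As a minimal sketch of the call (the site URL and list title below are placeholders - adjust them to your environment, and note the wiki list must already exist):

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

// Hypothetical snippet - "http://server/sites/teamsite" and "Wiki Pages"
// are placeholders for your own site URL and wiki list title
using (SPSite site = new SPSite("http://server/sites/teamsite"))
using (SPWeb web = site.OpenWeb())
{
    SPList wikiList = web.Lists["Wiki Pages"];
    SPUtility.AddDefaultWikiContent(wikiList);
}
```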

Another Way to Enrich Your SharePoint Forms - jQuery

I'm still enjoying that honeymoon period with a new tool in my SharePoint developer "bag of tricks" - jQuery. Such a powerful library!

As an example of its usefulness, in a recent project I needed to modify the behaviour of search scope checkboxes on an advanced search page in SharePoint (enforcing that the 'All Sites' checkbox is ticked if the user clears all other checkboxes).

Here are the steps I took using JavaScript plus jQuery:
1. Encapsulate the minified jQuery library in a SharePoint feature in order to deploy across the farm.
2. Add a script tag referencing the jQuery JS file in the 'PlaceHolderTitleAreaClass' content placeholder of the search page.
3. Add a Content Editor Web Part to the search page, and open the source view for that web part.
4. Add the following JavaScript function to the content editor source (to enable custom JavaScript to be called on page load):

    function addLoadEvent(func) {
        var oldonload = window.onload;
        if (typeof window.onload != 'function') {
            window.onload = func;
        } else {
            window.onload = function () {
                if (oldonload) {
                    oldonload();
                }
                func();
            };
        }
    }

5. Call a click handler processing function on page load with the following:
addLoadEvent(AddOnClickHandlers);

6. Add a function to assign onclick handlers to the targeted checkboxes. This is where the power of jQuery shines through - notice the short expression $('td.ms-advsrchText').find(':checkbox') that retrieves references to only the checkboxes I am seeking (these checkboxes are contained in table cells attributed with the class 'ms-advsrchText').

    //Add an onclick event handler for each of the checkboxes in the search form
    //(the checkboxes are identified by the 'ms-advsrchText' class applied to their containing table cells)
    function AddOnClickHandlers() {
        //Get the checkboxes in the form
        var chks = $('td.ms-advsrchText').find(':checkbox').get();
        for (var i = 0; i < chks.length; i++) {
            chks[i].onclick = SetFormState;
        }
    }

7. Write the onclick event handler - this needs to find the check box labelled "All Sites" and check its status. I have not included all the code, just a few lines showing more use of jQuery. The expression $('#' + chks[i].id + ' ~ label') builds a selector to get the next sibling for each check box; the next sibling is a label whose text contains the scope name (e.g. "All Sites").

    function SetFormState() {
        //... OTHER CODE ...

        //Get the scope checkboxes in the search form
        var chks = $('td.ms-advsrchText').find(':checkbox').get();

        //Enumerate through the check boxes
        for (var i = 0; i < chks.length; i++) {

            //Get the next sibling to the checkbox, which is a label element that contains the descriptive text.
            //This text is assumed to always be 'All Sites' for the main search scope
            var lbl = $('#' + chks[i].id + ' ~ label').get(0);

            //Save the reference to the All Sites checkbox
            if (lbl.innerText == 'All Sites') {
                //... OTHER CODE ...
            }
        }
    }

So, the actual code to get at the required elements on the page is very brief, and the range of JQuery selectors makes getting to the elements easy.

One tip - notice the use of the get() method call at the end of the jQuery expressions. This returns plain DOM elements that can then be manipulated with normal DOM scripting.
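As an aside, the chaining behaviour of addLoadEvent from step 4 can be sketched in isolation (the window object is mocked here purely so the sketch runs outside a browser):

```javascript
// Mock window object so the sketch runs outside a browser
var window = {};

function addLoadEvent(func) {
    var oldonload = window.onload;
    if (typeof window.onload != 'function') {
        // No handler registered yet - just assign ours
        window.onload = func;
    } else {
        // A handler already exists - wrap it so both run
        window.onload = function () {
            if (oldonload) {
                oldonload();
            }
            func();
        };
    }
}

var calls = [];
addLoadEvent(function () { calls.push('first'); });
addLoadEvent(function () { calls.push('second'); });

// Simulate the browser firing the load event
window.onload();
// calls now holds ['first', 'second'] - both handlers ran, in registration order
```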

Sunday, July 27, 2008

The SharePoint Paradox - Too Many Hammers

It happens with each new custom development I am involved with on the SharePoint platform - that initial temptation to write custom code, to head down the technical and complex route to a solution. Comes from having a background in software development, I guess!

The process of firing up Visual Studio and starting to code against the SharePoint APIs can thus become the "hammer" used to fix every "nail" (for "nail" read "business requirement").

But that is missing the point of SharePoint - there are plenty of built-in features that can greatly reduce the need for custom development. And hence reduce the cost to the client. Take one recent example from my work - a client required a web part that displayed documents created or modified by the current user, with some particular requirements around the way the metadata was displayed. Some of the functionality in this web part had me hovering the mouse over the Visual Studio start program link, but a little further design work revealed that the good ole' Data View (Data Form) web part could satisfy their needs.

The web part took around an hour to complete, with the necessary XSL jiggling. And one of the real benefits of taking this approach is the deployment - no WSP files to install on the server, no interruption to service. A simple text file to export and import.

Perhaps this is a paradox with SharePoint - as a development platform of significant breadth, it offers lots of different hammers for each nail. Me, I try (often have to remind myself!) to pick the lightest-weight hammer possible for each task. Lightweight = quicker development cycle = more agile and more responsive to customer requirements.

There is a proviso, of course (quite a major one) - the size of the client and their deployment/integration process may choose that hammer for you. The fast, lightweight approach that may be appropriate for clients with small, simple sites and a handful of users on a single server is unlikely to fit major organisations with strict control over the staging/QA/production environments. But even in that instance a Data View web part could be enveloped inside a SharePoint Feature and the deployment needs likely satisfied.

One step missing from the process when using this lightweight means of solving IT needs is the testing, or rather the inclusion of automated, repeatable testing. Someday I'll get a chance to look at the various web test frameworks around and to see whether NUnit combined with, say, WebAii or WatiN will help.

Wednesday, July 16, 2008

Getting that PDF Indexing to work in MOSS

Had a case where the latest Adobe IFilter had been installed, but the crawl log in MOSS was displaying "filtering process could not process this item" messages.

The key to fixing this issue was the information in this post from the Filter Center blog: checking the registry key value, and then adding the path of the Acrobat Reader folder to the system's PATH variable, were the solutions. With those in place, a full crawl correctly indexed the PDF documents.

To save the search for the Adobe Acrobat PDF file icon logo for referencing in the DocIcons.xml file, here is the location of the GIF file on the Adobe site: http://www.adobe.com/misc/linking.html
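For completeness, the icon is wired up with a one-line Mapping entry in DocIcons.xml (under the 12 hive's TEMPLATE\XML folder). A minimal sketch - the GIF file name here is an assumption, so use whatever name you gave the downloaded Adobe icon in TEMPLATE\IMAGES:

```xml
<Mapping Key="pdf" Value="pdficon_small.gif" />
```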


Tuesday, July 1, 2008

Hide the "View All Site Content" Quick Launch Link

To hide this link from all but those users with full control over a WSS site, open the site master page in SharePoint Designer and find the SPSecurityTrimmedControl element that contains a div with class ms-quicklaunchheader. The quickest route to this is to view the page in split view, and click on the "View All Site Content" link in the design pane.

The PermissionsString attribute of the SPSecurityTrimmedControl element determines what users can view this content. Change the value of this attribute to ManageWeb and only those users with rights to perform all admin tasks on the site will then be able to see the link on all pages in the site.

See this page for a list of all the possible permission string values.
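Put together, the edited section of the master page looks something like this (a sketch - the inner markup is abbreviated here, so keep whatever SharePoint Designer shows in your own master page):

```xml
<SharePoint:SPSecurityTrimmedControl runat="server" PermissionsString="ManageWeb">
    <div class="ms-quicklaunchheader">
        <!-- "View All Site Content" link markup here -->
    </div>
</SharePoint:SPSecurityTrimmedControl>
```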

Tuesday, June 17, 2008

WSS Search and Breadcrumb Links Fail over SSL

A site originally extended for extranet access now needed to be served over SSL, so I extended a new web application to enable this and modified the external Alternate Access Mapping entry to use https rather than http.

The site was then successfully accessible on the internet, though I did notice that the full path had to be entered in the browser address bar, including the page name itself - for example, https://[siteurl]/default.aspx rather than just https://[siteurl].

Further testing revealed that the links in the breadcrumbs all led to a "Page cannot be displayed" IE error, and submitting a search on the site gave an error message on the results page that stated:
"If the URL should be serving existing content, the system administrator may need to add a new request url mapping to the intended application"


The cause was all in the AAM entry - I had a port number (80) in the extranet URLs from some early trials, and had not removed this port number when switching to https. By setting the Extranet zone AAM internal URL and public URL to be simply https://[siteurl], these problems were solved.

Interesting that the pages were still accessible with an incorrect Access Mapping!

Monday, June 16, 2008

Double-check your hosting company's infrastructure

Story from the trenches today - a few weeks ago I customised a hosted WSS site, including UI changes and a small amount of functionality.

Somehow in one of the admin pages, the hosting company had exposed the capability to delete the complete site. So when my client did accidentally make use of that "feature", he called the hosting company to request a restore of the site.

Unfortunately, the hosting company discovered that they only had disaster recovery backups of the entire server farm - no application-level backup was in place. It would take them 5 man-days to retrieve the site.

Luckily, the site had not been used yet, so no data was lost. Just means that I have to rebuild from scratch... I had presumed that a hosted site would be safe, and so made no form of local backup.

So, in the words of Carl Franklin in Mondays (listen if you dare!), Things I have Learnt This Week:
  • Check the recovery options offered by a hosting company
  • Make your own backups of any customisations on hosted sites
  • Write reminders for yourself of any tricky little changes to make a site work as required
  • Take lots of screenshots

The latter point is worth reviewing. In this project I had taken a few screen shots of various pages within the site as development progressed, primarily to send to the client to show current state of play. Now I am using these screen shots to help recreate the site.

On the subject of screen shots: I use either Cropper or FastStone Capture for this task. Cropper is a free .NET utility, whereas Capture offers some nice, easy image annotation options (plus ragged edges to images if required). Thoroughly recommend them both!