Tuesday, May 27, 2014

A hidden benefit of presenting...

A big benefit of presenting technical subjects is the surprising new tangents that open up to widen your existing knowledge of a subject. When preparing demos for SharePoint or JavaScript sessions, I often find that an investigation into data or techniques to enliven an example uncovers lots of new ideas and approaches that can benefit my everyday work.

As an example, I was recently preparing a session on visualising data in SharePoint through the use of OData and client-side code, and came across the wonders of Crossfilter for analysing and processing large data sets on the client. It (and the similar library "PourOver") looks to make this type of data presentation very efficient. I'm looking forward to doing more with them.
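To give a flavour of what these libraries offer: Crossfilter lets you declare "dimensions" over a data set, apply a filter on one dimension, and efficiently query the others. The plain-JavaScript sketch below only illustrates that coordinated-filtering idea - it does not use Crossfilter's actual API, and the sample records are invented:

```javascript
// Invented sample records - the kind of data you might pull from an OData feed.
var requests = [
  { site: 'HR', durationMs: 120 },
  { site: 'HR', durationMs: 340 },
  { site: 'Finance', durationMs: 80 },
  { site: 'Finance', durationMs: 510 },
  { site: 'IT', durationMs: 95 }
];

// A "dimension" here is just an accessor plus an optional filter predicate.
function makeDimension(accessor) {
  return {
    accessor: accessor,
    filter: null,
    matches: function (record) {
      return this.filter === null || this.filter(this.accessor(record));
    }
  };
}

var bySite = makeDimension(function (r) { return r.site; });
var byDuration = makeDimension(function (r) { return r.durationMs; });

// Records that pass every dimension's current filter.
function currentRecords() {
  return requests.filter(function (r) {
    return bySite.matches(r) && byDuration.matches(r);
  });
}

// Filter on one dimension (slow requests only)...
byDuration.filter = function (ms) { return ms > 100; };

// ...and group what remains by another dimension (site).
var countsBySite = {};
currentRecords().forEach(function (r) {
  countsBySite[r.site] = (countsBySite[r.site] || 0) + 1;
});

console.log(countsBySite); // { HR: 2, Finance: 1 }
```

Crossfilter itself maintains sorted indexes per dimension, which is what keeps these operations fast even on data sets of hundreds of thousands of records.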

Have you used these libraries yourself? Any good or bad experiences to share?

Wednesday, March 26, 2014

More "Small Steps in SharePoint" guides

Today I have published two more Small Steps in SharePoint (and Office 365) guides to achieving everyday tasks in the system:
If you have any suggestions or requests on other tasks I could cover in these guides, please let me know and I will add new ones to the set.

Thursday, March 20, 2014

How to upload a file to SharePoint 2013 in Small Steps

Here's the first in a series of "Small Steps in SharePoint" guides on how to complete small tasks (and how to make small adjustments to enhance your usage of SharePoint).

This one is a beginner's guide in four steps to adding a document to a SharePoint library:

http://www.smallsteps.co.nz/UploadAFileToSharepoint.aspx

Tuesday, October 15, 2013

Reducing all that White Space in a SharePoint 2013 Page

Web part pages in SharePoint 2013 render with large spaces between the web part zones. This turns out to be caused by a border-spacing style applied to the ms-webpartPage-root class; adjusting it to 0px removes a lot of the unnecessary white space. White space is a Good Thing in design, but in the case of these web part pages it seems to be a little too much of a good thing. One way to make the adjustment is to include the following CSS:
.ms-webpartPage-root { border-spacing:0px !important; }

Monday, August 12, 2013

Solving Office 365 SkyDrive Pro Sync Problem (error 0x80040208)

SkyDrive Pro eases the process of uploading lots of files into a SharePoint document library - but can stop working if a file added to the SkyDrive Pro folder has a name that is longer than 64 characters.

When a file with a long file name is dragged into the SkyDrive Pro folder, you may find that anyone with a folder mapped to the associated SharePoint document library starts seeing messages about syncing problems. The SkyDrive Pro folder icon will also show an error in Windows Explorer.

One way to see if long filenames could be causing these problems is to search for all files in a library with long file names. To find all such files for a client, I created the following JavaScript that uses a CSOM (Client Side Object Model) query to get all files in a library (even if the files are located in folders in the library) and to display a list of all the problematic file names.

To use this code, add a new page to your site, edit the page and add "embedded script" into that page. Edit the source of that script, and paste the following HTML into the page - be sure to adjust the value in hostweburl to match the URL of your Office 365 site, and the value in documentLibraryTitle to match the display title of the document library you wish to test.

For more notes on the code, see my article on this subject on my site.

This code could be extended with a little work to offer all the document libraries in a site in a dropdown - if you have any ideas for extensions, or want help with the code, please leave a comment.
<script src="//ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.2.min.js" type="text/javascript"></script>
<script type="text/javascript">
    var hostweburl; // Web URL
    var ctx; // Client context
    var documentLibraryTitle = 'Documents'; // The title of the document library in which to seek long filenames

    $(document).ready(function () {
        // Full URL to the Office 365 site collection root web
        hostweburl = 'https://yoursite.sharepoint.com';
        // The js files are in a URL in the form: web_url/_layouts/15/resource_file
        var scriptbase = hostweburl + "/_layouts/15/";
        // Load the js files and continue to the execOperation function.
        $.getScript(scriptbase + "SP.Runtime.js",
            function () {
                $.getScript(scriptbase + "SP.js", execOperation);
            }
        );
    });

    // Main process, runs once the SP.Runtime.js and SP.js files have dynamically loaded
    function execOperation() {
        ctx = new SP.ClientContext();
        retrieveItems();
    }

    function retrieveItems() {
        var clientContext = new SP.ClientContext();

        this.oList = clientContext.get_web().get_lists().getByTitle(documentLibraryTitle);

        // Could use SP.CamlQuery.createAllItemsQuery() to create the query text
        // "<View Scope='RecursiveAll'><Query></Query></View>"
        // BUT this does not sort, and we need to sort by folder here
        var camlQuery = new SP.CamlQuery();
        // Get all documents from all folders, ordered by folder path
        var query = '<View Scope="RecursiveAll"><Query><OrderBy><FieldRef Name="FileDirRef" /></OrderBy></Query></View>';
        camlQuery.set_viewXml(query);

        this.collListItem = oList.getItems(camlQuery);

        // Only get the title and path information in order to reduce the size of the returned data
        clientContext.load(collListItem, 'Include(Title,FileDirRef,FileLeafRef)');

        clientContext.executeQueryAsync(
            Function.createDelegate(this, this.onListItemsQuerySucceeded),
            Function.createDelegate(this, this.onListItemsQueryFailed)
        );
    }

    function onListItemsQuerySucceeded() {
        var titles = '';
        var listEnumerator = this.collListItem.getEnumerator();
        var prevFolder = 'zzz'; // Sentinel that will never match a real folder path
        while (listEnumerator.moveNext()) {
            var oItem = listEnumerator.get_current();
            var fileName = oItem.get_item('FileLeafRef');
            var folderUrl = oItem.get_item('FileDirRef');
            // If the filename is longer than 63 characters, display a message
            if (fileName.length > 63) {
                // Write a folder heading each time the folder changes
                if (prevFolder != folderUrl) {
                    titles += '<div style="font-weight:bold;"><a href="' + hostweburl + folderUrl + '" target="_blank">' + folderUrl + '</a></div>';
                }
                titles += '<div style="padding-left:10px;">' + fileName + ' (length: ' + fileName.length + ')</div>';
                prevFolder = folderUrl;
            }
        }
        AddMessage(titles);
    }

    function onListItemsQueryFailed(sender, args) {
        alert('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
    }

    // Write the results into the "results" div below the script
    function AddMessage(message) {
        $('#results').html(message);
    }
</script>

<div id="results"></div>
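As a standalone illustration, here is the filename-length check the script performs, isolated into a plain JavaScript function you can run outside SharePoint (findLongFileNames is a hypothetical helper name, and the sample records are invented):

```javascript
// Returns the names longer than maxLength, grouped by folder path,
// mirroring the check in the CSOM success handler above.
function findLongFileNames(items, maxLength) {
  var results = {};
  items.forEach(function (item) {
    if (item.FileLeafRef.length > maxLength) {
      if (!results[item.FileDirRef]) {
        results[item.FileDirRef] = [];
      }
      results[item.FileDirRef].push(item.FileLeafRef);
    }
  });
  return results;
}

var sample = [
  { FileDirRef: '/Shared Documents', FileLeafRef: 'short.docx' },
  { FileDirRef: '/Shared Documents', FileLeafRef: 'a-very-long-report-name-that-keeps-going-and-going-and-going-forever.docx' },
  { FileDirRef: '/Shared Documents/Archive', FileLeafRef: 'notes.txt' }
];

// Flags only the over-long name in /Shared Documents
var offenders = findLongFileNames(sample, 63);
console.log(offenders);
```

The CSOM version above does the same thing, but sourced from a real document library and rendered as HTML rather than logged.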

Sunday, August 4, 2013

PowerShell to Find ListItems Modified Since a Date

Assuming that the list to be queried is already referenced in the $list object in script, here is a simple snippet to retrieve all list items that have been modified within the last one hundred days:

$previousRunDate = [Microsoft.SharePoint.Utilities.SPUtility]::CreateISO8601DateTimeFromSystemDateTime([DateTime]::Now.AddDays(-100))
$camlModifiedDateQuery = '<Where><Gt><FieldRef Name="Modified" /><Value Type="DateTime">{0}</Value></Gt></Where>' -f $previousRunDate
$query = New-Object Microsoft.SharePoint.SPQuery
$query.Query = $camlModifiedDateQuery
$listItems = $list.GetItems($query)

User Profiles missing from SharePoint 2010 People Search Results

Pleasantly surprising how a small configuration change can correct seemingly unrelated errors - people search in a SharePoint 2010 site which I assist with was missing hundreds of people from the search results listing.

The search log was showing a large number of errors stating "Error in PortalCrawl Web Service", each associated with a URL for the person.aspx page with an accountname in the querystring. Clicking on any of these URLs from the crawl log gave a "User not found" error. Interestingly, the account names in these URLs had the wrong delimiting character - they showed domain/username rather than domain\username.

There was also one top-level error showing that the crawl of the sps3 content source had failed. This was due to the default content access account not having the necessary administrator permissions on the User Profile service application. Granting the "Retrieve People Data for Search Crawlers" permission to that account solved the top-level error, and also had the effect of removing all the crawl log entries for the person.aspx crawl.

So the users are now showing in the search results, and everyone is happy!