SharePoint 2016: How To Get SPUser By User ID

While working with a whole range of applications built on the SharePoint platform and services, I have come across a lot of issues, especially while dealing with SharePoint user information, whether it is user names entered by application users on the UI or user information queried by User ID.

In most scenarios you will need to query user data based on the User ID from within code itself. I have added a small user interface in this article to make sure you get the idea of how this approach works and which properties are exposed in the results.

Here is the process flow that depicts the exchange of information between the client request and the server response:

To start with the demo, I have added a few HTML elements to prepare the UI, including a textbox to enter the User ID, as shown below-

In the screenshot below you can see the simple HTML markup for the user interface.

We have a textbox where users can enter the user ID of the SharePoint user.

In order to display the results, I have added an HTML table as a container. The purpose is to prepare HTML on the fly and inject it into this container at runtime.

In Step 1 we bind the blur event of the textbox to a function that will execute the query against the user data based on the entered User ID.

In Step 2 we call another helper function, “getUserById”, passing the user ID to it. This function call returns a jQuery promise which is evaluated in the upcoming steps.

In Step 3 we check whether the jQuery call has completed or not by using the jQuery “when” construct.

In Step 4 we call another helper function, “renderUser”, once the jQuery call from Step 3 has completed. The “renderUser” function is responsible for rendering the user information into the container. A consolidated sketch of this wiring (Steps 1-4) is shown below.
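
Here is a minimal sketch of the wiring described in Steps 1-4, assuming jQuery is loaded on the page; the element IDs “txtUserId” and “tblUserInfo” are hypothetical placeholders for the textbox and the result container.

// Step 1: bind the blur event of the textbox
$(document).ready(function () {
    $("#txtUserId").blur(function () {
        var userId = $(this).val();
        // Step 2: call the helper function that returns a jQuery promise
        var request = getUserById(userId);
        // Step 3: wait for the call to complete
        $.when(request).done(function (data) {
            // Step 4: render the user information into the container
            renderUser(data.d);
        }).fail(function () {
            $("#tblUserInfo").html("Unable to retrieve the user.");
        });
    });
});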

In Step 5 we call the “_api/Web” REST API endpoint using its “getUserById” function. During this call we specify “json” as the datatype to ensure that we get the results back in JSON format.

In Step 6 we render the details of the user in the container. A sketch of both helper functions is shown below.
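
Here is a minimal sketch of the two helpers from Steps 5-6; the element ID “tblUserInfo” is a hypothetical placeholder and “_spPageContextInfo” is used to resolve the current web URL.

// Step 5: query the "_api/Web" endpoint using its "getUserById" function
function getUserById(userId) {
    return $.ajax({
        url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/getuserbyid(" + userId + ")",
        type: "GET",
        dataType: "json",
        headers: { "Accept": "application/json;odata=verbose" }
    });
}

// Step 6: render the details of the user into the container
function renderUser(user) {
    var html = "<tr><td>" + user.Title + "</td>" +
               "<td>" + user.LoginName + "</td>" +
               "<td>" + user.Email + "</td></tr>";
    $("#tblUserInfo").html(html);
}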

And here is the final output of the operation performed.

So we can see the Title, Login Name, and Email returned for a specific User ID as shown below-

That is all for this demo.

Hope you find it helpful.

 

SharePoint 2016: How to configure State Service Application using PowerShell

There are many SharePoint components, like InfoPath Forms Services, Visio Services, Search, Workflows, and more, that rely on the SharePoint State Service Application to store session state across related HTTP(S) requests in a SQL Server database.

In this article, we will discuss the steps involved in configuring the SharePoint State Service Application using PowerShell.

First, let us go to the Central Administration site to confirm that the State Service Application is not available to be provisioned from the UI.

Go to “Central Administration” -> “Application Management” and click on “Manage Service Applications”.

On the Service Applications page, click the “New” menu in the ribbon to see the list of service applications that can be provisioned from this UI; we can see that the State Service Application is missing from this list.

That means we can provision this service application only by using PowerShell, and that is what we are going to do in this article.

In order to provision the service application using PowerShell, launch the “SharePoint 2016 Management Shell” and perform the following steps-

Step 1: Provision Service Application Instance

In this step we will provision the State Service using the “New-SPStateServiceApplication” cmdlet as shown below-

New-SPStateServiceApplication -Name "State Service Application"

Step 2: Provision Service Application Proxy

In this step we will provision the Service Application Proxy for the State Service by using the following command-

Get-SPStateServiceApplication | New-SPStateServiceApplicationProxy -DefaultProxyGroup

Step 3: Provision Database for State Service Application

In this step, we will provision the database for the State Service Application using the following command. We choose “State_Service_Database” as the database name-

Get-SPStateServiceApplication | New-SPStateServiceDatabase -Name "State_Service_Database"

Step 4: Install the State Database Schema

In this step we will install the database schema into the State Service database by using the following command-

Get-SPDatabase | Where-Object {$_.Type -eq "Microsoft.Office.Server.Administration.StateDatabase"} | Initialize-SPStateServiceDatabase

Step 5: Verify Service Application

In this step we will verify the provisioning of the service application by looking at the “Service Applications” page in “Central Administration” as shown below; a quick check from the Management Shell is also sketched below-
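
As an alternative verification (a minimal sketch; the exact output formatting depends on your environment), the same check can be done from the SharePoint 2016 Management Shell:

# List the State Service Application and its proxy along with their status
Get-SPStateServiceApplication | Select-Object Name, Status
Get-SPStateServiceApplicationProxy | Select-Object Name, Status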

Step 6: Verify Service Application Database

In this step we will verify the provisioning of the service application database by logging into “SQL Server Management Studio” as shown below; the same check can also be scripted as sketched below-
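
A minimal PowerShell sketch of the same check, reusing the database type filter from Step 4:

# Confirm the state database is registered with the farm and see which SQL Server instance hosts it
Get-SPDatabase | Where-Object {$_.Type -eq "Microsoft.Office.Server.Administration.StateDatabase"} | Select-Object Name, Server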

This concludes the Service Application Provisioning Process using PowerShell.

Hope you find it helpful.

SharePoint Developer Tools: How To Test & Debug SharePoint REST API Endpoints (POST Requests)

This is the second article in the series on using Fiddler as a debugging and testing tool for SharePoint REST API endpoints.

You can read the article on GET Request here:

SHAREPOINT DEVELOPER TOOLS: HOW TO TEST & DEBUG SHAREPOINT REST API ENDPOINTS (GET REQUESTS)

POST requests are different in nature from GET requests. They have to get through an additional validation layer in order to push data into SharePoint lists and libraries.

In order to run a POST request successfully we need an additional request header, “X-RequestDigest”, which is nothing but the form digest token that SharePoint uses to validate the request.

In order to request this token from SharePoint we need to make use of the “contextinfo” endpoint, which returns the “FormDigestValue” containing the required token.

Now let us see how we can request this token from SharePoint.

Get Authorization Token

http://<Host Name>/_api/contextinfo
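
Note that the “contextinfo” endpoint has to be called as a POST request (an empty body is fine). A minimal sketch of the request is shown below; the digest is returned under “d.GetContextWebInformation.FormDigestValue” in the JSON response and is then copied into the “X-RequestDigest” header of the subsequent POST requests.

POST http://<Host Name>/_api/contextinfo
Accept: application/json;odata=verbose
Content-Length: 0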

Once we get the digest token from SharePoint, we can add it to the request headers of each of the POST requests.

Request Headers

Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose
X-RequestDigest: 0xE1AE266A42214DA2940689826F68426D10620220CEDD3093CA2C234993E4ECA265BA57D357E8D3BD32F56660613CADBF72495F2C858B38F7C9B9C3CAD797F6D5,06 Feb 2017 01:22:08 -0000

Once we are ready with Request Headers we can start issuing POST Requests as shown below-

Add Data to List

Let’s consider we have a list called Categories as shown below-

First, let us see the XML returned when querying the schema of the Categories list using the following URL-

http://<Host Name>/_api/Web/Lists/getByTitle('Categories')

Then we will see the XML returned when querying the Categories list items using the following URL-

http://<Host Name>/_api/Web/Lists/getByTitle('Categories')/Items

The next step is to prepare the request body; we have to include the following properties to add the item.

Please note that I am including only the properties required by this list to add a category; you can add any number of properties to the request body as per the schema of the target list.

Request Body

"__metadata": { type: " SP.Data.CategoriesListItem" },
Title: "Category From Fiddler",
CategoryID: 9,
Description: “New Category Added from Fiddler”

Once we execute this request we can inspect the response to ensure that the item has been added successfully to the Categories list.

We can also validate the newly added item by browsing the Categories list.

Update List Item

http://<Host Name>/_api/Web/Lists/getByTitle('Categories')/Items(9)

For an update request you have to include the “eTag” value that was returned with the item during the initial query. SharePoint uses this value to determine whether any updates have been made to the item since it was last queried.

“If-Match: *” can be used to match any “eTag” value, resulting in the operation being performed regardless of the actual value.

“X-Http-Method: PATCH” is used to update the existing values.

So the additional request headers and the request body would look like this-

IF-MATCH: *
X-Http-Method: PATCH

{
    "__metadata": { "type": "SP.Data.CategoriesListItem" },
    "Title": "Category From Fiddler - Updated",
    "Description": "New Category Added from Fiddler - Updated"
}

Once the request has executed successfully we can see the item updated in the Categories list.

Delete List Item

http://<Host Name>/_api/Web/Lists/getByTitle('Categories')/Items(9)

The delete operation is more or less similar to the update operation.

In the case of delete we make use of “X-Http-Method: DELETE” in the request headers.

Request Headers

IF-MATCH: *
X-Http-Method: DELETE

Once the request has executed successfully we can validate that the item has been deleted from the list.

Add New List

http://<Host Name>/_api/Web/Lists

Adding a new SharePoint list involves a bit more configuration information in the request body, apart from the request headers.

Request Headers

Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose

Request Body

{
    "__metadata": { "type": "SP.List" },
    "AllowContentTypes": true,
    "ContentTypesEnabled": true,
    "Description": "This is Task List Created using Fiddler",
    "BaseTemplate": 107,
    "Title": "Task List By Fiddler"
}

Once this request has executed successfully we can see the response body holding information about the newly added SharePoint list.

We can also see this new list added to SharePoint by browsing the respective site.

We can also verify that the “AllowContentTypes” & “ContentTypesEnabled” properties are configured as expected by browsing the Advanced Settings of the new list as shown below-

Delete List

http://<Host Name>/_api/Web/Lists/getByTitle('Task%20List%20By%20Fiddler')

Deleting a list is rather simpler than adding one. It only takes “X-Http-Method: DELETE” added to the request headers, and the rest is done for you.

Request Headers

Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose
Content-Length: 0
IF-MATCH: *
X-Http-Method: DELETE

Once the request has completed successfully, the required list is deleted from the SharePoint lists collection.

Hope you find it helpful.

SharePoint Developer Tools: How to Test & Debug SharePoint REST API Endpoints (GET Requests)

In this article we will understand how to utilize a popular developer productivity tool called Fiddler as a REST API test client for SharePoint (though the target system could be anything with a valid REST API endpoint).

Fiddler is primarily used as a web proxy that allows you to intercept the REST API request/response cycle. The usage of this tool has increased with the shift in modern SharePoint development paradigms, which favor client-side development techniques, strategies, and platforms over traditional farm solutions.

In the upcoming sections of this article I will guide you on how to use Fiddler to test REST API calls against SharePoint data.

In this article we will explore only GET requests.

To start with this demo, launch Fiddler, go to the “Rules” menu, and select “Automatically Authenticate”; this allows Fiddler to respond to SharePoint's authentication challenges on your behalf using your current credentials.

If this setting is not enabled you might encounter “401 UNAUTHORIZED” as shown below-

Also notice the request headers that are required to call the SharePoint REST API endpoints.

GET Requests

http://<Host Name>/_api/<SharePoint Endpoint>

Request Headers
Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose

Get Web Object

http://<Host Name>/_api/web

  • Click on “Compose” Tab
  • Select request type as “GET” from dropdown
  • Specify the Request URL as http://<Host Name>/_api/web
  • Click on “Execute” Button

Once the request is issued using the Fiddler “Composer”, we can see the request details in the left pane.

When we click on the request in the left pane, we can see the detailed breakdown in the right pane.

For instance we can click on “Inspectors” tab and then click on “JSON” tab.

JSON Tab will display the response received from SharePoint in JSON Format.

Similarly, we can execute other GET requests as shown in the upcoming screenshots-

Get List Object

http://<Host Name>/_api/Web/Lists

Get Lists which are not hidden and have Items

http://<Host Name>/_api/Web/Lists?$select=Title,Hidden,ItemCount&$orderby=ItemCount&$filter=((Hidden eq false) and (ItemCount gt 0))

Encoded Version of Request URL

http://<Host Name>/_api/Web/Lists?$select=Title,Hidden,ItemCount&$orderby=ItemCount&$filter=((Hidden%20eq%20false)%20and%20(ItemCount%20gt%200))

Get Web, Selecting Only the Title Property

http://<Host Name>/_api/Web/?$select=Title

Get Web and Lists Using Lookup Properties, Expanding the Lists Collection

http://<Host Name>/_api/Web/?$select=Title,Lists/Title,Lists/Hidden,Lists/ItemCount&$expand=Lists

Get Web and Current User Using Lookup Properties, Expanding the CurrentUser Property

http://<Host Name>/_api/Web/?$select=Title,CurrentUser/Id,CurrentUser/Title&$expand=CurrentUser/Id

That is all for this demo.

Hope you find it helpful.

SharePoint Online/2016/2013: How To Upload Large Files Using PowerShell Automation

Uploading large files to SharePoint On-Premises or Online is a common problem during data migration from external systems like Lotus Notes.

Here is one such error which we might encounter while trying to upload a file larger than 250 MB-

In this article I will explain a data upload strategy whereby we split a large file into multiple chunks of smaller size and upload them sequentially.

Solution Architecture Diagram

For better understanding we can refer to the following solution architecture diagram-

Based on this diagram we can conclude the following facts-
1. This solution can be hosted on multiple servers to launch parallel uploads
2. This solution can consume data from network file shares
3. Once a data file is retrieved (say of size 300 MB), this solution will split the file automatically into chunks (say 100 MB each) based on the pre-configured chunk size (which should not exceed the 250 MB size limit)
4. Each chunk is then appended to the uploaded file over multiple iterations

In order to start with this demo we would need a SharePoint document library in a SharePoint Online (or On-Premises) site as shown below-

Another prerequisite for the demo is to have files of various sizes that we can upload to the document library.

I made use of the “fsutil” command-line utility to generate files of various sizes. This utility takes the destination file path and the size of the file in bytes as input.

Here is the usage example of the command line utility-

fsutil file createnew "C:\Prashant\Self Paced Training\Sample Files\2GB.txt" 2147483648

Similarly I have generated other files too as shown below-

Now let's dive into the code to understand the actual implementation.

Step 1: Declare variables to hold the document library name and folder path. For production use I recommend keeping these values in an external configuration file.

Step 2: Read the files from the folder specified in the path variable from Step 1.

Step 3: Loop through all the files and pass each file to the “UploadLargeFiles” function along with the document library name (a sketch of this driver loop is shown below).
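
Here is a minimal sketch of Steps 1-3; the site URL, library name, and chunk size are placeholders, and the function is named “UploadLargeFileToLibrary” to match Step 13 (Step 3 of the article refers to the same function as “UploadLargeFiles”).

# Step 1: variables for the target library and the source folder (keep these in a config file for production use)
$libraryName = "Large Documents"                                  # hypothetical library name
$folderPath  = "C:\Prashant\Self Paced Training\Sample Files"
$siteUrl     = "https://<tenant>.sharepoint.com/sites/demo"       # placeholder site URL
$chunkSizeMB = 100

# Step 2: read all the files from the source folder
$files = Get-ChildItem -Path $folderPath -File

# Step 3: pass each file to the upload function along with the document library name
foreach ($file in $files) {
    UploadLargeFileToLibrary -FilePath $file.FullName -LibraryName $libraryName -SiteUrl $siteUrl -ChunkSizeMB $chunkSizeMB
}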

Step 4: Generate a unique upload ID and get the name of the file to be uploaded.

Step 5: Get a handle on the document library object and load the root folder (or any target folder) within the document library.

Step 6: Calculate the block size to be uploaded and total file size (as shown in the architecture diagram)

Step 7: Read the bytes from the source file and set the read buffer based on the block size

Step 8: Read the bytes based on the buffer limit that we set in the earlier steps.

Step 9: Check if this is the first chunk being uploaded; if yes, then add a new file to the SharePoint document library, get the file content based on the buffer size for the chunk, and call the “StartUpload” function that is defined on the “Microsoft.SharePoint.Client.File” class. This will add the file to the document library, but with only a small chunk of its content.

Step 10: If this is not the first chunk being uploaded, find the file in the document library and get a handle on it.

Step 11: If this is another chunk of data which is not the last chunk, it will be appended to the same file by using the “ContinueUpload” function defined on the “Microsoft.SharePoint.Client.File” class. This appends the content to the file identified by the upload ID that we initialized in the earlier steps.

Step 12: If this is the last chunk of data, it will be appended to the same file by using the “FinishUpload” function defined on the “Microsoft.SharePoint.Client.File” class. This appends the content to the file identified by the upload ID and commits the changes; once this function completes successfully, the changes are persisted to the file. A condensed sketch of Steps 4-12 is shown below.
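
Here is a condensed sketch of Steps 4-12, assuming the SharePoint Client (CSOM) assemblies are available on the machine running the script. The assembly paths, parameter names, and credential handling are assumptions, and the single-chunk case (a file smaller than the chunk size) is left out for brevity.

Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

function UploadLargeFileToLibrary {
    param([string]$FilePath, [string]$LibraryName, [string]$SiteUrl, [int]$ChunkSizeMB = 100)

    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    # $ctx.Credentials = ...   # SharePointOnlineCredentials for SPO, default credentials for on-premises

    # Step 4: unique upload id and file name
    $uploadId = [Guid]::NewGuid()
    $fileName = [System.IO.Path]::GetFileName($FilePath)

    # Step 5: handle on the document library and its root folder
    $library = $ctx.Web.Lists.GetByTitle($LibraryName)
    $ctx.Load($library.RootFolder)
    $ctx.ExecuteQuery()

    # Step 6: block size and total file size
    $blockSize = $ChunkSizeMB * 1024 * 1024
    $fileSize  = (Get-Item $FilePath).Length

    # Step 7: open the source file and prepare the read buffer
    $reader = New-Object System.IO.BinaryReader([System.IO.File]::OpenRead($FilePath))
    $buffer = New-Object byte[] $blockSize
    $fileOffset = 0
    $uploadFile = $null

    try {
        # Step 8: read the source file chunk by chunk
        while (($bytesRead = $reader.Read($buffer, 0, $buffer.Length)) -gt 0) {
            $chunk = New-Object byte[] $bytesRead
            [Array]::Copy($buffer, $chunk, $bytesRead)
            $stream = New-Object System.IO.MemoryStream(,$chunk)
            $isLastChunk = ($fileOffset + $bytesRead) -ge $fileSize

            if ($uploadFile -eq $null) {
                # Step 9: first chunk - add an empty file, then start the upload session
                $fileInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
                $fileInfo.Url = $fileName
                $fileInfo.Overwrite = $true
                $fileInfo.ContentStream = New-Object System.IO.MemoryStream
                $uploadFile = $library.RootFolder.Files.Add($fileInfo)
                $result = $uploadFile.StartUpload($uploadId, $stream)
                $ctx.ExecuteQuery()
                $fileOffset = $result.Value
            }
            elseif (-not $isLastChunk) {
                # Steps 10-11: intermediate chunk - append it with ContinueUpload
                $result = $uploadFile.ContinueUpload($uploadId, $fileOffset, $stream)
                $ctx.ExecuteQuery()
                $fileOffset = $result.Value
            }
            else {
                # Step 12: last chunk - FinishUpload appends it and commits the file
                $uploadFile = $uploadFile.FinishUpload($uploadId, $fileOffset, $stream)
                $ctx.ExecuteQuery()
            }
        }
    }
    finally {
        $reader.Close()
        $ctx.Dispose()
    }
}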

Step 13: Perform exception handling and call the “UploadLargeFileToLibrary” function.

I recommend reading the documentation on the Microsoft.SharePoint.Client.File class and understanding these functions carefully before using them.

Once we execute this script we can see the following information-

  1. File Name to be uploaded
  2. Chunk size
  3. Total Time taken to upload the files

It is important to note that total time taken to upload the files may vary depending on the hosting environment.

File Size to be uploaded: 10 MB

File Size to be uploaded: 50 MB

File Size to be uploaded: 500 MB

File Size to be uploaded: 2 GB

Once the script has executed successfully we can see the respective files uploaded to the SharePoint Online site as shown below-

That is all for this demo.

This article is equally applicable to both SharePoint Online and On-Premises versions.

Hope you find it helpful.

SharePoint 2013/2016: How to Find Duplicate Records in SharePoint List

During one of my assignments I came across a situation where we needed to fix data issues in SharePoint lists.

One of the issues that we found was the presence of duplicate data. In order to fix the problem at hand, I developed a PowerShell script to find duplicate data based on a specific column or a group of columns.

For the sake of the demo, I have added a SharePoint list with some duplicate records in it as shown below:

Now let's look into the code to understand the implementation details-

In Step 1 we get references to the site and web where the SharePoint list resides.

In Step 2 we split the list of columns based on which we want to find the duplicate data.

We can see there are two input variables, “ColumnToValidate” and “ColumnToDisplay”. “ColumnToValidate” provides the columns on which duplicates need to be checked, while “ColumnToDisplay” contains the list of columns that need to be part of the data export.

In Step 3 we create the export folder that will hold the CSV files exported with the duplicate records.

In Step 4 we create the list object that gives us a handle on the list that needs to be validated.

In Step 5 we get the list of items from the SharePoint list and group them based on the validation columns.

In Step 6 we create the directory for the export files.

In Step 7 we export all the groups having an item count greater than 1 (this is the logic that identifies the duplicate items). A sketch of the whole function is shown below.
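
Here is a minimal sketch of the “Get-DuplicateListItems” function described in Steps 1-7, assuming it runs in the SharePoint Management Shell on an on-premises farm; the parameter names and the export file naming are assumptions.

function Get-DuplicateListItems {
    param(
        [string]$SiteUrl,
        [string]$ListName,
        [string]$ColumnToValidate,     # e.g. "Title" or "Title,Role"
        [string]$ColumnToDisplay,      # e.g. "Title,Role,Location"
        [string]$ExportFolder = "C:\Temp\DuplicateExport"
    )

    # Step 1: references to the site and web hosting the list
    $site = Get-SPSite $SiteUrl
    $web  = $site.RootWeb

    # Step 2: split the validation and display column lists
    $validateColumns = $ColumnToValidate.Split(",") | ForEach-Object { $_.Trim() }
    $displayColumns  = $ColumnToDisplay.Split(",")  | ForEach-Object { $_.Trim() }

    # Steps 3 & 6: create the export folder if it does not exist
    if (-not (Test-Path $ExportFolder)) { New-Item -ItemType Directory -Path $ExportFolder | Out-Null }

    # Step 4: handle on the list to be validated
    $list = $web.Lists[$ListName]

    # Step 5: group the list items on a composite key built from the validation columns
    $groups = $list.Items | Group-Object -Property {
        $item = $_
        ($validateColumns | ForEach-Object { $item[$_] }) -join "|"
    }

    # Step 7: export every group with more than one item (these are the duplicates)
    $duplicates = $groups | Where-Object { $_.Count -gt 1 }
    $exportFile = Join-Path $ExportFolder ("Duplicates_" + ($validateColumns -join "_") + ".csv")

    $duplicates | ForEach-Object { $_.Group } | ForEach-Object {
        $item = $_
        $row  = New-Object PSObject
        foreach ($col in $displayColumns) {
            $row | Add-Member -MemberType NoteProperty -Name $col -Value $item[$col]
        }
        $row
    } | Export-Csv -Path $exportFile -NoTypeInformation

    Write-Host ("Found " + ($duplicates | Measure-Object -Property Count -Sum).Sum + " items with duplicate values in [" + $ColumnToValidate + "]")

    $web.Dispose()
    $site.Dispose()
}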

That is all for the code.

Now we will see how the output varies depending on the columns specified for the duplicate check.

In Step 8 we specify the validation and display columns; for the first execution we will check for duplicate values in the “Title” column.

In Step 9 we call the “Get-DuplicateListItems” function to find the duplicate values, as in the usage sketch below.
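
A minimal usage sketch for Steps 8-9; the site URL and list name are hypothetical.

# Step 8: validation and display columns for the first execution
# Step 9: call the function to find and export the duplicates
Get-DuplicateListItems -SiteUrl "http://sp-2016-dev/sites/demo" -ListName "Employees" -ColumnToValidate "Title" -ColumnToDisplay "Title,Role,Location"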

After the function executes successfully we can see the following output.

In Step 10 we can see the output of this execution: 6 items were found with duplicate values in the “Title” column.

In Step 11 we can see the CSV file exported by this execution, with the “Title” column used for validation.

In Step 12 we can see the output file and notice the duplicate values in the “Title” column.

In Step 13 we have changed the list of columns to be validated. In this second execution I have added another column, “Role”.

Now the list will be validated for duplicates based on the combination of the “Title” and “Role” columns.

In Step 14 we can see the output of this execution: 4 items were found with duplicate values in the “Title” and “Role” columns.

In Step 15 we can see the CSV file exported by this execution, with the “Title” and “Role” columns used for validation.

In Step 16 we can see the output file and notice the duplicate values in the “Title” and “Role” columns.

In Step 17 we have changed the list of columns to be validated once more. In this third execution I have added another column, “Location”.

Now the list will be validated for duplicates based on the combination of the “Title”, “Role”, and “Location” columns.

In Step 18 we can see the output of this execution: 2 items were found with duplicate values in the “Title”, “Role”, and “Location” columns.

In Step 19 we can see the CSV file exported by this execution, with the “Title”, “Role”, and “Location” columns used for validation.

In Step 20 we can see the output file and notice the duplicate values in the “Title”, “Role”, and “Location” columns.

This is a very simple technique that can be used to fix one of the issues with SharePoint List data.

Hope you find it helpful.

SharePoint 2016/2013/Online: How to Optimize SharePoint Custom Pages Using HTML5 IndexedDB API

In this article we will discuss another common performance issue with SharePoint solutions that involve a large volume of data transactions surfacing on SharePoint custom pages.

This becomes more prominent if we have strict governance in place and are not allowed to make use of advanced server-side options (custom web service endpoints, MTA-enabled modules, etc.).

In a recent assignment I came across a similar scenario where I needed to crawl data from an external web service endpoint and surface it on SharePoint pages. The anticipated data volume was huge, and traditional caching approaches like cookies would not work due to size limitations.

In pursuit of a solution I went through the HTML5 storage APIs, which allow you to set up an in-browser transactional database system called “IndexedDB”.

Here is a quick introduction to IndexedDB; for more details I recommend you go through the IndexedDB documentation.

“IndexedDB is a transactional database system, like an SQL-based RDBMS. However, unlike SQL-based RDBMSes, which use fixed-column tables, IndexedDB is a JavaScript-based object-oriented database. IndexedDB lets you store and retrieve objects that are indexed with a key; any objects supported by the structured clone algorithm can be stored. Operations performed using IndexedDB are done asynchronously, so as not to block applications.”

I also want to thank “Raymond Camden” for his detailed research on storage limits for IndexedDB; I believe you should refer to his article to understand the limits carefully before getting into a concrete implementation.

Now let's try to understand the implementation details by using the following diagram:

Solution Architecture Diagram & Explanation

In this solution the SharePoint page will first look for the required data in a local IndexedDB database created to support this page. If the data is not found in the local database, the page will issue a request for the data from the SharePoint list.

Since we are dealing with 100,000 items present in the SharePoint list, I made use of a “REST API + OData continuation” data access strategy to overcome the SharePoint list view threshold limits. This mechanism accesses only 100 list items at a time, and it is safe to extend this limit up to 2,000 items per fetch.

Each fetch returns a JSON object that is persisted into IndexedDB as an individual record. I opted for this strategy to reduce the page load time. If the number of items is small, you can add each item as a separate record.

Every subsequent data call is automatically diverted to the local database as the primary source.

Additionally, we can add “Auto Refresh Modules” to keep the local database fresh with SharePoint list changes and sync the changes with IndexedDB asynchronously.

Ideally speaking, for a complete solution, “Auto Refresh Modules” are a must-have.

So this is all about the execution summary for this solution.

Now let's have a look at the implementation details as follows-

I have created a SharePoint list with two columns and 100,000 items added to it as shown below.

Demo

This list will be acting as the data source for the page. In actual scenarios this source could be a web service endpoint which can provide voluminous data on demand.

Before getting into the code let's see how this page behaves on execution. Demonstrating the page in action will be helpful later when we take a deep dive into the code.

If we run the page we can see that it took about 3 minutes to complete execution.

The first execution cycle will include the following actions:

  1. Initialize IndexedDB Database
  2. Query SharePoint List
  3. Add REST API Response to IndexedDB
  4. Load page with data from IndexedDB

Since we are adding data to the store asynchronously, the overall application remains functional even though it takes about 3 minutes to complete.

The following screenshot shows data being added to IndexedDB asynchronously-

We can also review the IndexedDB database initialized as part of this request by using the Developer Tools (F12 key) within the browser as shown below-

We can explore each item within each of the JSON objects as shown below-

Now refresh the page to run the execution again; this time the page request completes in roughly 1 second.

The subsequent execution cycle will include the following actions:

  1. Query IndexedDB for data
  2. Load page with data from IndexedDB

So we can see how we can trim the execution path by using a well-defined strategy.

Code Analysis

Let's do the code analysis to understand the concrete implementation.

In Step 1 we capture some of the literals as variables and will refer to these variables later in the code.

In Step 2 we check whether the respective IndexedDB database has already been initialized and, if not, initialize it. In this demo let's call this database “Products”.

In Step 3 the “onsuccess” event handler gets executed and the database object is stored in a global variable, “SharePointOptimization.sharePointStore”. This variable acts as the starting point for all future operations on the database.

In Step 4 a default error-handling module is assigned as the callback function for the “onerror”, “onblocked”, and “onabort” event handlers. A minimal sketch of Steps 2-4 is shown below.
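
Here is a minimal sketch of Steps 2-4, assuming a global namespace object “SharePointOptimization” and a database and object store both named “Products”; the function name follows Step 16, and the version number and key strategy are assumptions (“onabort” applies to transactions rather than the open request, so it is not wired up here).

var SharePointOptimization = SharePointOptimization || {};

function InitializeSharePointStore() {
    // Step 2: open (and, on first run, create) the "Products" database
    var openRequest = window.indexedDB.open("Products", 1);

    openRequest.onupgradeneeded = function (e) {
        var db = e.target.result;
        if (!db.objectStoreNames.contains("Products")) {
            // each record will be one JSON batch returned by the REST call
            db.createObjectStore("Products", { autoIncrement: true });
        }
    };

    // Step 3: keep the database handle in a global variable for later use
    openRequest.onsuccess = function (e) {
        SharePointOptimization.sharePointStore = e.target.result;
    };

    // Step 4: default error handling callback
    openRequest.onerror = openRequest.onblocked = function (e) {
        console.log("IndexedDB error: ", e);
    };
}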

In Step 5 we query the SharePoint list using the REST API.

In Step 6 we make use of OData continuation techniques to overcome the SharePoint list view threshold restrictions.

In this step we also call the “AddDataToStore” function, which adds the SharePoint list items coming back as a JSON object to the local IndexedDB database. It is important to recall that in this demo I am storing 1 JSON object as 1 record in the database, and each object contains information for 100 list items. A sketch of this paging logic is shown below.
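
Here is a minimal sketch of Steps 5-6, assuming jQuery and “_spPageContextInfo” are available on the page and that the list is called “Products” (a hypothetical name); the “$top=100” batch size mirrors the 100 items per fetch mentioned above.

// Steps 5 & 6: page through the list 100 items at a time and persist each batch
function QuerySharePoint(url) {
    // first call: build the REST URL; subsequent calls: follow the __next continuation link
    url = url || _spPageContextInfo.webAbsoluteUrl +
                 "/_api/web/lists/getbytitle('Products')/items?$top=100";

    $.ajax({
        url: url,
        type: "GET",
        headers: { "Accept": "application/json;odata=verbose" }
    }).done(function (data) {
        // Step 7: persist the whole batch as a single record
        AddDataToStore(data);

        if (data.d.__next) {
            // OData continuation: keep fetching until there is no __next link
            QuerySharePoint(data.d.__next);
        }
    });
}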

In Step 7 we add the JSON objects to IndexedDB. In order to do that we need to perform the following operations-

  • Initialize a transaction with read-write permissions
  • Get a handle on the “Products” store inside the IndexedDB database
  • Call the asynchronous “add” method to add the JSON object to the “Products” store

In Step 8 we call the “QuerySharePoint” function to query data from the SharePoint list in case the data is not available in the local database. A sketch of the “AddDataToStore” helper follows (the Step 8 fallback appears inside the “ReadSPStore” sketch further below).
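
A minimal sketch of Step 7, assuming the database handle stored in Step 3:

// Step 7: add one JSON batch to the "Products" store
function AddDataToStore(data) {
    var db = SharePointOptimization.sharePointStore;
    var transaction = db.transaction(["Products"], "readwrite");
    var store = transaction.objectStore("Products");
    var addRequest = store.add(data);

    addRequest.onerror = function (e) {
        console.log("Could not add the batch to the store: ", e);
    };
}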

Steps 9, 10, and 11 explain the “ReadSPStore” function, where we read the data from the local data store (IndexedDB).

In Step 9 the following operations are performed-

  • Initialize a transaction with read permissions
  • Get a handle on the “Products” store inside the IndexedDB database
  • Call the asynchronous “count” method to get the total number of JSON objects available in the “Products” store

In Step 10 the following operations are performed-

  • Check the status of the count request
  • If successful, initialize an IndexedDB cursor by calling the asynchronous “openCursor” function

In Step 11 the following operations are performed (see the sketch after this list)-

  • Check the status of the cursor request
  • If successful, read the record from IndexedDB and add it to a local array variable
  • Call the “continue” function as long as there are items left in the local store
  • Once all the data has been read and saved to the local array, pass this array to the “RenderUI” function to render the data on the interface as required
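
Here is a minimal sketch of Steps 8-11 combined, assuming the same “Products” store and the “RenderUI” and “QuerySharePoint” functions referenced above:

// Steps 9-11: read all batches back from the store and hand them to the UI
function ReadSPStore() {
    var db = SharePointOptimization.sharePointStore;
    var store = db.transaction(["Products"], "readonly").objectStore("Products");
    var items = [];

    // Step 9: count the records available in the store
    var countRequest = store.count();
    countRequest.onsuccess = function () {
        if (countRequest.result === 0) {
            // Step 8: nothing cached yet, fall back to the SharePoint list
            QuerySharePoint();
            return;
        }

        // Step 10: open a cursor over the store
        var cursorRequest = store.openCursor();
        cursorRequest.onsuccess = function (e) {
            var cursor = e.target.result;
            if (cursor) {
                // Step 11: collect each record and move to the next one
                items.push(cursor.value);
                cursor.continue();
            } else {
                // all records have been read, so render the UI
                RenderUI(items);
            }
        };
    };
}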

In Step 12 we can plug in any UI engine to produce a more intuitive UI as applicable; for the sake of this demo I am writing out the count of store records multiplied by 100 (since each record contains 100 items) to show the total number of items stored in the local store.

Steps 13, 14, and 15 show a helper function to check whether the local store contains the required data or not. It helps to decide whether we need to read the data from the local store or from the SharePoint list.

The “GetProductCount” function is quite similar to the “ReadSPStore” function, except that it performs fewer operations.

In Step 16 we initialize the local SharePoint store by calling the “InitializeSharePointStore” function.

In Step 17 we can see some of the UI elements used to build a basic UI for this demo.

Point of caution

Before implementing this mechanism make sure you have identified all the browser compatibility issues in this area.

I would recommend referring to the following site every now and then to make sure you are using features supported by the targeted browsers.

http://caniuse.com/#search=IndexedDb

Since I have made use of artifacts that are compatible with SharePoint Online development guidelines, we can use this approach with pages hosted in SharePoint Online as well.

That is all for this demo.

Hope you find it helpful.