SharePoint Developer Tools: How to Test & Debug SharePoint REST API Endpoints (GET Requests)

In this article we will understand how to utilize Fiddler, a popular developer productivity tool, as a REST API test client for SharePoint (though the target system could be anything with a valid REST API endpoint).

Fiddler is primarily used as a web proxy that allows you to intercept the REST API request-response cycle. The usage of this tool has increased with the shift in modern SharePoint development paradigms, which favor client-side development techniques/strategies/platforms over traditional Farm Solutions.

In the upcoming sections of this article I will guide you through using Fiddler to test REST API calls against SharePoint data.

In this article we will explore only GET requests.

To start with this demo, launch Fiddler, go to the “Rules” menu, and select “Automatically Authenticate”. This lets Fiddler authenticate you against SharePoint based on the user token it has stored.

1

If this setting is not enabled, you might encounter a “401 UNAUTHORIZED” response as shown below-

2

Also notice the request headers that are required to execute the SharePoint REST API endpoint.

GET Requests

http://<Host Name>/_api/<SharePoint Endpoint>

Request Headers
Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose
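
For reference, a GET request with these headers can also be issued from script. Here is a minimal PowerShell sketch, assuming Windows authentication with your current credentials; the $siteUrl value is illustrative (the Content-Type header matters only for requests that carry a body)-

# Minimal sketch: issue the same GET request from PowerShell; $siteUrl is illustrative
$siteUrl = "http://sp-2016-dev"
Invoke-RestMethod -Uri "$siteUrl/_api/web" `
    -Headers @{ "Accept" = "application/json;odata=verbose" } `
    -UseDefaultCredentials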

Get Web Object

http://<Host Name>/_api/web

  • Click on “Compose” Tab
  • Select request type as “GET” from dropdown
  • Specify the Request URL as http://<Host Name>/_api/web
  • Click on “Execute” Button

3

Once the request is issued using the Fiddler “Composer”, we can see the request details in the left pane.

4

When you click on the request in the left pane, you can see the detailed breakdown in the right pane.

For instance, we can click on the “Inspectors” tab and then on the “JSON” tab.

The JSON tab will display the response received from SharePoint in JSON format.

5

Similarly, we can execute other GET requests as shown in the upcoming screenshots-

Get List Object

http://<Host Name>/_api/Web/Lists

6

7

Get Lists which are not hidden and have items

http://<Host Name>/_api/Web/Lists?$select=Title,Hidden,ItemCount&$orderby=ItemCount&$filter=((Hidden eq false) and (ItemCount gt 0))

Encoded Version of Request URL

http://<Host Name>/_api/Web/Lists?$select=Title,Hidden,ItemCount&$orderby=ItemCount&$filter=((Hidden%20eq%20false)%20and%20(ItemCount%20gt%200))

8

9

Get Web selecting only the Title property

http://<Host Name>/_api/Web/?$select=Title

10

11

Get Web and Lists using lookup properties, expanding the Lists collection

http://<Host Name>/_api/Web/?$select=Title,Lists/Title,Lists/Hidden,Lists/ItemCount&$expand=Lists

12

13

Get Web and Current User using lookup properties, expanding the CurrentUser property

http://sp-2016-ddev/_api/Web/?$select=Title,CurrentUser/Id,CurrentUser/Title&$expand=CurrentUser

14

15

That is all for this demo.

Hope you find it helpful.


SharePoint Online/2016/2013: How To Upload Large Files Using PowerShell Automation

Uploading large files to SharePoint On-Premise or Online is a common problem during data migration from external systems like Lotus Notes.

Here is one such error, which we might encounter while trying to upload a file larger than 250 MB-

1

In this article I will explain a data upload strategy where we split a large file into multiple smaller chunks.

Solution Architecture Diagram

For better understanding we can refer to the following solution architecture diagram-

2

Based on this diagram we can conclude the following facts-
1. This solution can be hosted on multiple servers to launch parallel uploads
2. This solution can consume data from network file shares
3. Once a data file is retrieved (say of size 300 MB), this solution will split it into chunks (say 100 MB each) automatically based on the pre-configured chunk size (which should not exceed the 250 MB size limit)
4. Each chunk is then appended to the uploaded file over multiple iterations

In order to start with this demo we need a SharePoint document library in a SharePoint Online (or On-Premise) site as shown below-

3

Another prerequisite for the demo is to have files of various sizes that we can upload to the document library.

I made use of the following command line utility to generate files of various sizes. This utility takes the destination file path and the size of the file in bytes as input.

Here is the usage example of the command line utility-

fsutil file createnew "C:\Prashant\Self Paced Training\Sample Files\2GB.txt" 2147483648

Similarly I have generated other files too as shown below-

4

Now let’s dive down into the code to understand the actual implementation.

Step 1: Declare variables to hold the document library name & folder path. For production use I recommend keeping these values in an external configuration file.

Step 2: Read the files from the folder specified in the path variable in Step 1

Step 3: Loop through all the files and pass each file to the “UploadLargeFiles” function along with the document library name
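
Here is a minimal sketch of Steps 1-3; the folder path and library name are illustrative, and “UploadLargeFiles” is the chunked upload function described in the later steps-

# Steps 1-3 (sketch): variable values are illustrative
$libraryName = "Demo Documents"   # better kept in an external configuration file
$folderPath  = "C:\Prashant\Self Paced Training\Sample Files"

# Read the files from the folder and pass each one to the upload function
Get-ChildItem -Path $folderPath -File | ForEach-Object {
    UploadLargeFiles -File $_.FullName -DocLibName $libraryName
}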

5

Step 4: Generate a unique upload id & get the name of the file to be uploaded

Step 5: Get a handle on the document library object and load the root folder (or any target folder) within the document library

Step 6: Calculate the block size to be uploaded and the total file size (as shown in the architecture diagram)

Step 7: Read the bytes from the source file and set the read buffer based on the block size

6

Step 8: Read the bytes based on the buffer limit that we set in the earlier steps

7

Step 9: Check if this is the first chunk being uploaded; if yes, add a new file to the SharePoint document library, get the file content based on the buffer size for the chunk, and call the “StartUpload” function defined on the “Microsoft.SharePoint.Client.File” class. This adds the file to the document library, but with a small chunk of content only.

Step 10: Check if this is not the first chunk being uploaded; if so, find the file in the document library and get a handle on it

Step 11: If this is another chunk of data which is not the last chunk, the chunk is appended to the same file by using the “ContinueUpload” function defined on the “Microsoft.SharePoint.Client.File” class. This appends the content to the file identified by the upload id that we initialized in the earlier steps.

Step 12: If this is the last chunk of data, the chunk is appended to the same file by using the “FinishUpload” function defined on the “Microsoft.SharePoint.Client.File” class. This appends the content to the file identified by the upload id and commits the changes; once this function completes successfully, the changes are persisted to the file.
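
Putting Steps 4-12 together, here is a condensed PowerShell (CSOM) sketch of the upload function. It assumes an authenticated ClientContext in $ctx and a file larger than one chunk; all names are illustrative, so treat this as a starting point rather than the exact script from the screenshots-

# Condensed sketch of Steps 4-12; assumes an authenticated $ctx (ClientContext)
function UploadLargeFiles {
    param($File, $DocLibName, [long]$BlockSize = 100MB)

    $uploadId = [Guid]::NewGuid()                        # Step 4: unique upload id
    $fileName = [System.IO.Path]::GetFileName($File)
    $library  = $ctx.Web.Lists.GetByTitle($DocLibName)   # Step 5: target library
    $ctx.Load($library.RootFolder)
    $ctx.ExecuteQuery()
    $fileUrl  = $library.RootFolder.ServerRelativeUrl + "/" + $fileName

    $fs = [System.IO.File]::OpenRead($File)
    $br = New-Object System.IO.BinaryReader($fs)
    $fileSize = $fs.Length                               # Step 6: total file size
    $offset = [long]0
    $first = $true

    while ($offset -lt $fileSize) {
        # Steps 7-8: read the next chunk into the buffer
        $buffer = $br.ReadBytes([int][Math]::Min($BlockSize, $fileSize - $offset))
        $stream = New-Object System.IO.MemoryStream(, $buffer)
        $isLast = ($offset + $buffer.Length) -ge $fileSize

        if ($first) {
            # Step 9: add a stub file, then start the upload session
            $info = New-Object Microsoft.SharePoint.Client.FileCreationInformation
            $info.Url = $fileName
            $info.Overwrite = $true
            $info.ContentStream = New-Object System.IO.MemoryStream   # empty stub
            $uploadFile = $library.RootFolder.Files.Add($info)
            [void]$uploadFile.StartUpload($uploadId, $stream)
        }
        else {
            # Step 10: get a handle on the partially uploaded file
            $uploadFile = $ctx.Web.GetFileByServerRelativeUrl($fileUrl)
            if (-not $isLast) {
                [void]$uploadFile.ContinueUpload($uploadId, $offset, $stream)   # Step 11
            }
            else {
                [void]$uploadFile.FinishUpload($uploadId, $offset, $stream)     # Step 12: commit
            }
        }
        $ctx.ExecuteQuery()
        $offset += $buffer.Length
        $first = $false
    }
    $br.Close(); $fs.Close()
}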

8

Step 13: Perform exception handling and call the “UploadLargeFileToLibrary” function

9

I recommend reading the documentation on the Microsoft.SharePoint.Client.File class and understanding these functions carefully before using them.

Once we execute this script we can see the following information-

  1. File Name to be uploaded
  2. Chunk size
  3. Total Time taken to upload the files

It is important to note that the total time taken to upload the files may vary depending on the hosting environment.

File Size to be uploaded: 10 MB

10

File Size to be uploaded: 50 MB

11

File Size to be uploaded: 500 MB

12

File Size to be uploaded: 2 GB

13

Once the script has executed successfully, we can see the respective files uploaded to the SharePoint Online site as shown below-

14

That is all for this demo.

This article is equally applicable to both SharePoint Online & On-Premise versions.

Hope you find it helpful.

SharePoint 2013/2016: How to Find Duplicate Records in SharePoint List

During one of my assignments I came across a situation where we needed to fix data issues in SharePoint Lists.

One of the issues we found was the presence of duplicate data. In order to fix the problem at hand, I developed a PowerShell script to find duplicate data based on a specific column or a group of columns.

For the sake of the demo, I have added a SharePoint List with some duplicate records in it as shown below:

1

Now let’s look into the code to understand the implementation details-

In Step 1 we get references to the Site and Web where the SharePoint List resides

In Step 2 we split the list of columns based on which we want to find the duplicate data

We can see there are two input variables, “ColumnToValidate” and “ColumnToDisplay”. “ColumnToValidate” provides the columns based on which duplicates need to be checked, while “ColumnToDisplay” contains the list of columns that need to be part of the data export.

In Step 3 we create the export folder that will hold the CSV files exported with duplicate records

In Step 4 we create the list object that gives us a handle on the list which needs to be validated

2

In Step 5 we get the list of items from the SharePoint List and group them based on the validation columns

In Step 6 we create the directory for the export files

In Step 7 we export all the groups having an item count greater than 1 (this logic identifies the duplicate items)
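
A minimal sketch of Steps 5-7 using the server-side object model; the variable names are illustrative and the real script in the screenshots may differ-

# Sketch of Steps 5-7; assumes $list (SPList) and the two input variables
$validateCols = $ColumnToValidate.Split(",")   # e.g. "Title" or "Title,Role"
$displayCols  = $ColumnToDisplay.Split(",")

# Project each list item onto the display columns
$rows = foreach ($item in $list.Items) {
    $row = [ordered]@{}
    foreach ($col in $displayCols) { $row[$col] = $item[$col] }
    New-Object PSObject -Property $row
}

# Group on the validation column(s); a group with more than one item is a duplicate set
$duplicates = $rows | Group-Object $validateCols | Where-Object { $_.Count -gt 1 }

# Ensure the export directory exists, then export the duplicates
New-Item -ItemType Directory -Path $exportFolder -Force | Out-Null
$duplicates | ForEach-Object { $_.Group } |
    Export-Csv -Path (Join-Path $exportFolder "DuplicateRecords.csv") -NoTypeInformation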

3

That is all for the code.

Now we will see how the output varies depending on the columns specified for the duplicate check

In Step 8 we specify the validation and display columns; for the first execution we will check for duplicate values in the “Title” column

In Step 9 we call the “Get-DuplicateListItems” function to find the duplicate values
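
An illustrative invocation of Steps 8-9; the function’s parameter names are assumptions-

# Steps 8-9 (sketch): parameter names are hypothetical
$ColumnToValidate = "Title"
$ColumnToDisplay  = "Title,Role,Location"
Get-DuplicateListItems -ColumnToValidate $ColumnToValidate -ColumnToDisplay $ColumnToDisplay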

4

After the function executes successfully, we can see the following output.

In Step 10 we can see the output of this execution: 6 items were found with duplicate values in the “Title” column

5

In Step 11 we can see the CSV file exported by this execution, with the “Title” column validated.

6

In Step 12 we can see the output file and notice the duplicate values in the “Title” column

7

In Step 13 we change the list of columns to be validated. For this second execution I have added another column, “Role”.

Now the list will be validated for duplicates based on the combination of the “Title” & “Role” columns

8

In Step 14 we can see the output of this execution: 4 items were found with duplicate values in the “Title” & “Role” columns

9

In Step 15 we can see the CSV file exported by this execution, with the “Title” & “Role” columns validated

10

In Step 16 we can see the output file and notice the duplicate values in the “Title” & “Role” columns

11

In Step 17 we change the list of columns to be validated again. For this third execution I have added another column, “Location”.

Now the list will be validated for duplicates based on the combination of the “Title”, “Role” & “Location” columns

12

In Step 18 we can see the output of this execution: 2 items were found with duplicate values in the “Title”, “Role” & “Location” columns

13

In Step 19 we can see the CSV file exported by this execution, with the “Title”, “Role” & “Location” columns validated

14

In Step 20 we can see the output file and notice the duplicate values in the “Title”, “Role” & “Location” columns

15

This is a very simple technique that can be used to fix one of the issues with SharePoint List data.

Hope you find it helpful.

SharePoint 2016/2013/Online: How to Optimize SharePoint Custom Pages Using HTML5 IndexedDB API

In this article we will discuss another common performance issue with SharePoint solutions involving a large volume of data transactions surfacing on SharePoint custom pages.

This can become more prominent if we have strict governance in place and are not allowed to make use of advanced server-side options (custom web service endpoints, MTA-enabled modules, etc.).

In one of my recent assignments I came across a similar scenario, where I needed to crawl data from an external web service endpoint and surface it on SharePoint pages. The anticipated data volume was huge, and traditional caching approaches like cookies wouldn’t work due to size limitations.

In pursuit of a solution I went through the HTML5 storage APIs, which allow you to set up an in-browser transactional database system called “IndexedDB”.

Here is a quick introduction to IndexedDB; for details I recommend you visit the IndexedDB documentation-

“IndexedDB is a transactional database system, like an SQL-based RDBMS. However, unlike SQL-based RDBMSes, which use fixed-column tables, IndexedDB is a JavaScript-based object-oriented database. IndexedDB lets you store and retrieve objects that are indexed with a key; any objects supported by the structured clone algorithm can be stored. Operations performed using IndexedDB are done asynchronously, so as not to block applications.”

I also want to thank Raymond Camden for his detailed research on storage limits for IndexedDB, and I believe you should refer to his article to understand the limits carefully before getting into a concrete implementation.

Now let’s try to understand the implementation details by using the following diagram:

Solution Architecture Diagram & Explanation

1

In this solution the SharePoint page first looks for the required data in the local IndexedDB database created to support the page. If the data is not found in the local database, the page issues a request for the data from the SharePoint List.

Since we are dealing with 100,000 items present in the SharePoint List, I made use of a “REST API + OData continuation” data access strategy to overcome the SharePoint List threshold limits. This mechanism accesses only 100 list items at a time, and it is safe to extend this limit up to 2,000 items per fetch.

Each fetch returns a JSON object that is persisted into IndexedDB as an individual record. I opted for this strategy to reduce the page load time. If the number of items is not large, you can add each item as a separate record.

Every subsequent data call will be automatically diverted to the local database as the primary source.

Additionally, we can add “Auto Refresh Modules” to keep the local database fresh with SharePoint List changes, syncing the changes with IndexedDB asynchronously.

Ideally speaking, for a complete solution “Auto Refresh Modules” are a must-have.

So this is all about the execution summary for this solution.

Now let’s have a look at the implementation details as follows-

I have created a SharePoint List with two columns and 100,000 items added to it as shown below.

Demo

This list will act as the data source for the page. In real scenarios this source could be a web service endpoint which can provide voluminous data on demand.

2

3

Before getting into the code let’s see how this page behaves on execution. Demonstrating the page in action will be helpful later when we take a deep dive into the code.

If we run the page, we can see it took about 3 minutes to complete execution.

The first execution cycle will include the following actions:

  1. Initialize IndexedDB Database
  2. Query SharePoint List
  3. Add REST API Response to IndexedDB
  4. Load page with data from IndexedDB

Since we are adding data to the store asynchronously, the overall application remains functional even though it takes about 3 minutes to complete.

4

The following screenshot shows data being added to IndexedDB asynchronously-

5

We can also review the IndexedDB database initialized as part of this request using the browser Developer Tools (F12), as shown below-

6

We can explore each item in each of the JSON objects as shown below-

7

Now refresh the page to see the execution again; this time the page request completes in roughly 1 second.

The subsequent execution cycle will include the following actions:

  1. Query IndexedDB for data
  2. Load page with data from IndexedDB

So we can see how we can trim the execution path by using a well-defined strategy.

8

Code Analysis

Let’s do the code analysis to understand the concrete implementation.

In Step 1 we declare some of the literals as variables and refer to these variables later in the code

9

In Step 2 we check whether the respective IndexedDB database is already initialized, and if not, we initialize it. In this demo let’s call this database “Products”

10

In Step 3 the “onsuccess” event handler gets executed and the database object is stored in a global variable, “SharePointOptimization.sharePointStore”. This variable acts as the starting point for all future operations on the database.

In Step 4 a default error handling module is assigned as the callback function to the “onerror”, “onblocked” and “onabort” event handlers
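
A minimal JavaScript sketch of Steps 2-4, assuming a store named “Products”; the exact code in the screenshots may differ-

// Steps 2-4 (sketch): open (and if needed create) the "Products" database
var SharePointOptimization = SharePointOptimization || {};

var openRequest = window.indexedDB.open("Products", 1);

// Runs only when the database is created or its version changes
openRequest.onupgradeneeded = function (e) {
    var db = e.target.result;
    if (!db.objectStoreNames.contains("Products")) {
        db.createObjectStore("Products", { autoIncrement: true });
    }
};

// Step 3: keep the database handle in a global variable for later operations
openRequest.onsuccess = function (e) {
    SharePointOptimization.sharePointStore = e.target.result;
};

// Step 4: default error handling module
openRequest.onerror = openRequest.onblocked = function (e) {
    console.error("IndexedDB error", e);
};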

11

In Step 5 we query the SharePoint List using the REST API

12

In Step 6 we make use of the OData continuation technique to overcome the SharePoint List threshold restrictions.

In this step we also call the “AddDataToStore” function, which adds the SharePoint List items coming back as a JSON object to the local IndexedDB database. It is important to recall that in this demo I am storing 1 JSON object as 1 record in the database, and each object contains information for 100 list items.
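
Here is a hypothetical sketch of Steps 5-6; the list name and page size are assumptions, and jQuery is assumed to be available-

// Steps 5-6 (sketch): page through the list 100 items at a time,
// following the OData continuation link (data.d.__next)
function QuerySharePoint(url) {
    url = url || _spPageContextInfo.webAbsoluteUrl +
        "/_api/web/lists/getbytitle('Products')/items?$top=100";
    $.ajax({
        url: url,
        headers: { "Accept": "application/json;odata=verbose" }
    }).done(function (data) {
        AddDataToStore(data.d.results);       // persist 100 items as one record
        if (data.d.__next) {
            QuerySharePoint(data.d.__next);   // keep following the continuation link
        }
    });
}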

13

In Step 7 we add the JSON objects to IndexedDB. In order to do that we need to perform the following operations, as sketched after this list-

  • Initialize a transaction with read-write permissions
  • Get a handle on the “Products” store inside the IndexedDB database
  • Call the asynchronous “add” method to add the JSON object to the “Products” store
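
A minimal sketch of Step 7, reusing the assumed “Products” store-

// Step 7 (sketch): one JSON object (100 items) saved as a single record
function AddDataToStore(items) {
    var db = SharePointOptimization.sharePointStore;
    var tx = db.transaction(["Products"], "readwrite");  // read-write transaction
    tx.objectStore("Products").add(items);               // asynchronous add
}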

In Step 8 we call the “QuerySharePoint” function to query the data from the SharePoint List in case the data is not available in the local database.

14

Steps 9, 10 and 11 describe the “ReadSPStore” function, where we read the data from the local data store (IndexedDB); a sketch follows these lists

In Step 9 the following operations are performed-

  • Initialize a transaction with read permissions
  • Get a handle on the “Products” store inside the IndexedDB database
  • Call the asynchronous “count” method to get the total number of JSON objects available in the “Products” store

In Step 10 the following operations are performed-

  • Check the status of the count request
  • On success, initialize an IndexedDB cursor by calling the asynchronous “openCursor” function

In Step 11 the following operations are performed-

  • Check the status of the cursor request
  • On success, read the record from IndexedDB and add it to a local array variable
  • Call the “continue” function as long as there are records left in the local store
  • Once all the data is read and saved to the local array, pass the array to the “RenderUI” function to render the data on the interface as required
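
Putting Steps 9-11 together, a minimal sketch of “ReadSPStore” (again assuming the “Products” store)-

// Steps 9-11 (sketch): count the records, then walk them with a cursor
function ReadSPStore() {
    var store = SharePointOptimization.sharePointStore
        .transaction(["Products"], "readonly")
        .objectStore("Products");
    var results = [];

    var countRequest = store.count();            // Step 9: total number of records
    countRequest.onsuccess = function () {
        var cursorRequest = store.openCursor();  // Step 10: initialize the cursor
        cursorRequest.onsuccess = function (e) { // Step 11: read record by record
            var cursor = e.target.result;
            if (cursor) {
                results = results.concat(cursor.value); // each record = 100 items
                cursor.continue();
            } else {
                RenderUI(results);  // all records read; hand the array to the UI
            }
        };
    };
}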

15

In Step 12 we can plug in any UI engine to produce a more intuitive UI as applicable; for the sake of this demo I am writing out the count of store records * 100 (since each record contains 100 items) to show the total number of items stored in the local store.

16

Steps 13, 14 and 15 show a helper function to check whether the local store contains the required data. It helps us decide whether we need to read the data from the local store or from the SharePoint List

The “GetProductCount” function is quite similar to the “ReadSPStore” function, except that it performs fewer operations

17

In Step 16 we initialize the local SharePoint store by calling the “InitializeSharePointStore” function
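
A sketch of Steps 13-16; the callback-based shape of “GetProductCount” is an assumption-

// Steps 13-15 (sketch): count-only helper
function GetProductCount(callback) {
    var store = SharePointOptimization.sharePointStore
        .transaction(["Products"], "readonly")
        .objectStore("Products");
    var countRequest = store.count();
    countRequest.onsuccess = function () { callback(countRequest.result); };
}

// Step 16 (sketch): use the local store when it has data, otherwise query SharePoint
function InitializeSharePointStore() {
    GetProductCount(function (count) {
        if (count > 0) { ReadSPStore(); } else { QuerySharePoint(); }
    });
}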

18

In Step 17 we can see some of the UI elements that build a basic UI for this demo

19

Point of caution

Before implementing this mechanism, make sure you have identified all the browser compatibility issues in this area.

I recommend referring to the following site every now and then to make sure you are using features supported by the targeted browsers-

http://caniuse.com/#search=IndexedDb

20

Since I have made use of artifacts that are compatible with the SharePoint Online development guidelines, we can use this approach with pages hosted in SharePoint Online as well.

That is all for this demo.

Hope you find it helpful.

 

Developer’s Tools: How To Generate Basic Authentication Token

This demo is about another tool that I worked out during an assignment involving an integration scenario with web services supporting Basic Authentication.

The problem with basic authentication is that you have to keep the username and password stored somewhere in order to generate the authentication token.

Since I was integrating external web services with SharePoint, I decided to delegate the token generation process to an external tool and consume the authentication token directly, without keeping the username and password stored in the code itself.

To run this demo I have created a simple interface that takes a username and password as input, generates the Basic Authentication Token, and displays it in the “Authentication Token” section-

1

Now let’s discuss the code behind this functionality:

Step 1 registers the button click event by mapping a function, “getToken”

Step 2 calls the “getBasicAuthenticationToken” function, passing the username and password

Step 3 prepares the “username:password” format required to convert the plain text into a base-64 hash value

Step 4 calls the JavaScript function “btoa”, which encodes the plain text into base-64 format, the hash value required to prepare the Basic Authentication Token.

“Basic” is prefixed to the hash value to comply with the Basic Authentication token standard
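
A minimal JavaScript sketch of Steps 1-4; the element ids are illustrative-

// Steps 1-4 (sketch): element ids and helper names are illustrative
document.getElementById("btnGetToken").addEventListener("click", getToken);

function getToken() {
    var userName = document.getElementById("txtUserName").value;
    var password = document.getElementById("txtPassword").value;
    document.getElementById("authToken").innerText =
        getBasicAuthenticationToken(userName, password);
}

function getBasicAuthenticationToken(userName, password) {
    // btoa() base-64 encodes the "username:password" pair;
    // "Basic " is prefixed per the Basic Authentication scheme
    return "Basic " + btoa(userName + ":" + password);
}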

2

That is it for the code.

Now when we click the “Get Authentication Token” button, we see the authentication token in the “Authentication Token” section.

3

This token can be used with any web service supporting basic authentication, and this strategy can be merged with other functionality to generate the token on the fly.

Hope you find it helpful.

SharePoint 2016/2013/Online- How to Apply Password Encryption for Component as Service using PowerShell

Recently I developed a couple of PowerShell based components that serve as data crawlers for federated data sources like external web services, SQL Server databases, Excel workbooks & SharePoint Lists.

In order to authenticate the service accounts against all of these sources, I had no choice but to embed the usernames and passwords within the PowerShell code in plain text. It gets even worse when some of the web services support only “Basic Authentication”.

Saving passwords in plain text in code files could lead to compliance issues and could eventually get the solutions rejected.

In order to fix this issue I implemented a couple of mechanisms to deal with each type of authentication requirement.

In this article I will discuss the mechanism to authenticate the requests to SharePoint Lists.

In order to simplify this demo, let’s consider a simple scenario where I have a list, “MyLocations”, as shown below, and I need to export its metadata using a PowerShell based component.

1

To keep the content crisp I will walk you through the specific sections of the code, skipping all the CSOM-specific code, which you can refer to in my earlier articles if you like.

I have intentionally divided this implementation into two separate code files in order to keep the passwords safe from the developers. The intent is to have the encrypted password file generated by the SharePoint admins, who then provide these files to the developers so that they can use them in code directly, as shown below.

In the following code snippet you can see the commands to encrypt the password “12345678” and export it to a text file, “BANSALP.txt”-
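
A minimal sketch of these commands, assuming the standard DPAPI-based cmdlets; the file path is illustrative-

# Encrypt the password and export it to a text file; the path is illustrative
ConvertTo-SecureString "12345678" -AsPlainText -Force |
    ConvertFrom-SecureString |
    Out-File "C:\Secrets\BANSALP.txt"

Note that ConvertFrom-SecureString uses the Windows Data Protection API by default, so the exported file can be decrypted only by the same user account on the same machine.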

2

This file would look like the one shown below:

34

This way you can store the passwords for all required service accounts in different text files without violating security compliance.

Now, in order to pass this encrypted password to SharePoint for authentication, we can make use of the “System.Management.Automation.PSCredential” class as shown below.

Here the “Get-Content” cmdlet is used to read the content from the “BANSALP.txt” file, and the “ConvertTo-SecureString” cmdlet to get the encrypted password back as a secure string
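
A sketch of this step; the account name is illustrative-

# Rebuild the credential object from the encrypted password file
$securePassword = Get-Content "C:\Secrets\BANSALP.txt" | ConvertTo-SecureString
$credentials = New-Object System.Management.Automation.PSCredential("DOMAIN\BANSALP", $securePassword)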

5

Once the credential object has been created, we can assign it to the SharePoint client context “Credentials” property
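
For instance, assuming $ctx is the CSOM ClientContext; which credential type applies depends on the target environment-

# SharePoint Online:
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("bansalp@contoso.com", $securePassword)

# On-Premise (Windows authentication):
$ctx.Credentials = $credentials.GetNetworkCredential()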

6

With this client context, SharePoint authenticates the incoming request based on the ACL of the requester.

Following is the outcome of the call that we sent to SharePoint:

7

I have exported the metadata to a “csv” file as well, which would look like this-

8

Hope you find it helpful.

SharePoint(2016/2013) – Tableau Sync Manager

In the previous article, SharePoint 2016/2013: OData Connector for Tableau Reports, I explained the implementation details of connecting SharePoint with Tableau using OData connections.

Since that approach was not reliable and was not working for me as expected, I developed another reusable add-in to achieve data sync between SharePoint & Tableau.

I call this add-in the “SharePoint - Tableau Sync Manager”. To understand this add-in better, you can refer to the Technical Diagram section below

Technical Diagram

1

Following is a brief description of each of the components of this add-in-

Task Scheduler: This component is responsible for executing the Sync Service at a defined frequency.

Sync Service: This component will perform the following operations-

  1. Provision staging database, though this is optional and can be done directly at database level
  2. Query SharePoint using CSOM/REST API endpoints and sync the data with the staging database, which is a SQL Server based database
  3. Generate logs with differential changes
  4. Send Email Notifications to Tableau Report Administrator/Owners

Analytics Staging Database: This database stores the data retrieved from SharePoint and acts as the primary data source for the Tableau report.

Tableau Report: This could be any Tableau report based on a query from the staging database.

Demo

In order to set up this demo, I have created a SharePoint List, “MyLocations”, that holds location data as shown below:

2

The staging database is provisioned with a table, “My_Locations”, that has corresponding columns to store the data from SharePoint as shown below-

3

Once the database has been created, we can write any required SQL query to fetch the data. In this case I have used a simple select statement to fetch all the data from the table-

4

Now let’s look into the Sync Service code that talks to SharePoint using CSOM/REST API endpoints.

Step 1 involves connecting to the SharePoint List by using the usual PowerShell CSOM technique
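
A minimal sketch of Step 1; the site URL is illustrative-

# Step 1 (sketch): connect to the list via CSOM and fetch all items
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("http://sp-2016-dev/sites/demo")
$list = $ctx.Web.Lists.GetByTitle("MyLocations")
$items = $list.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$ctx.Load($items)
$ctx.ExecuteQuery()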

5

Step 2 involves connecting to the staging database and deleting the existing content from the “My_Locations” table
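
A sketch of Step 2; the connection string is illustrative-

# Step 2 (sketch): open the staging database and clear the table
$connection = New-Object System.Data.SqlClient.SqlConnection("Server=.;Database=AnalyticsStaging;Integrated Security=True")
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = "DELETE FROM My_Locations"
$command.ExecuteNonQuery() | Out-Null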

6

Step 3 involves reading the data from the SharePoint List and inserting it into the staging table
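
A sketch of Step 3; the column names are assumptions based on this demo-

# Step 3 (sketch): copy each list item into the staging table
$command.CommandText = "INSERT INTO My_Locations (Title, LocationType) VALUES (@Title, @LocationType)"
[void]$command.Parameters.Add("@Title", [System.Data.SqlDbType]::NVarChar, 255)
[void]$command.Parameters.Add("@LocationType", [System.Data.SqlDbType]::NVarChar, 255)

foreach ($item in $items) {
    $command.Parameters["@Title"].Value        = $item["Title"]
    $command.Parameters["@LocationType"].Value = $item["LocationType"]
    $command.ExecuteNonQuery() | Out-Null
}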

7

Step 4 involves an exception handler that sends notifications to the process administrators in case any error occurs during the sync process
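
A sketch of Step 4; the SMTP details are illustrative-

# Step 4 (sketch): notify the administrators if the sync fails
try {
    # ... Steps 1-3 ...
}
catch {
    Send-MailMessage -To "admin@contoso.com" -From "syncservice@contoso.com" `
        -Subject "SharePoint - Tableau Sync failed" -Body $_.Exception.Message `
        -SmtpServer "smtp.contoso.com"
}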

8

Additionally, we can generate differential logs and success notifications for the report owners, or extend this layer to connect with other sources as well.

So that is all for the code.

Once it executes successfully, we can see the data has been synced from SharePoint to the staging database-

9

Go to the database and run the select query again to verify the data synced successfully-

10

Once the data source is ready, we can start designing the Tableau report using Tableau Desktop, taking “Microsoft SQL Server” as the connection.

On connecting to SQL Server, specify the query to fetch the data for the report as shown below:

11

Once the data connection is successful, we can see the data surfacing in Tableau Designer-

12

Now you can design the report of your choice based on this data. For this demo I am presenting the information as a geo map, which is most suitable for the kind of data that we have in the “MyLocations” list in SharePoint-

13

Based on this report I have added a dashboard and published it to Tableau Server-

14

This is the final look of the Tableau report executing in the browser-

15

Now let’s consider that we need to add another location type; to do so, follow the steps below:

  1. Access the SharePoint site
  2. Add a list item to the “MyLocations” list as highlighted below

16

Once the data has been added/updated in SharePoint, the Sync Manager will pick up all the changes and sync them back to the staging database per the defined schedule-

17

Once the Sync Manager executes successfully, just refresh the report by using the “Refresh” button on the report-

18

And sure enough, you will see the changes reflected on the report-

19

We can extend this Sync Manager to cater to even more advanced scenarios, which I may cover in some of the upcoming articles.

That is all for this demo.