SharePoint 2010 Fast Disk Space Part 2 (Solution)

Plan

The solution for the issue is to stop the service, clear the logs, start the service back up, and then reprocess the old click analysis. SPRel should then process those files and improve the link rankings.

Note: This has an impact on search results; proceed with caution.

Backup

Back up the following locations to a different server or drive.

<drive>:\FASTSearch\var\log\sprel\

<drive>:\FASTSearch\data\sprel\ (all folders below this)

<drive>:\FASTSearch\components\resourcestore\generic\clickthrough

Execution

Stop the sprel process, delete the logs, start the service back up, run the SPRel processing command, and ensure everything works as expected. Test the whole procedure in a test environment before trying it in a live environment.

  1. Run the following command to stop the sprel process:
     nctrl stop sprel walinkstorerreceiver
  2. Clean the data/sprel directory (except the config folder).
  3. Run the following command to start the sprel process again:
     nctrl start sprel walinkstorerreceiver
  4. Confirm everything is OK by running the following command:
     spreladmin ShowStatus
  5. Start the processing by executing the following command:
     spreladmin StartProcessing
  6. Regularly check the status by executing the following command:
     spreladmin ShowStatus
  7. Once the status shows "ready, doing nothing", the process is complete.

(Screenshot: SPRel status after the fix)

Update

We then faced an issue during an incremental crawl. The following error popped up, and some search results were affected as a result.

The Content Plugin received a "Processing Error" response from the backend server
for the item. ( ERROR: PostProcessBatch failure in processor SPCIDStorer.
Traceback (most recent call last):
  File "SPCIDStorer.py", line 63, in PostProcessBatch
  File "TableSender.py", line 111, in send
WindowsError: [Error 3] The system cannot find the path
specified: 'E:\\FASTSE~1\\data\\sprel\\cidstorer_backup/*.*' [17]

Fix: Create an empty folder <drive>:\FASTSearch\data\sprel\cidstorer_backup

Results

The expectation is that the logs shouldn’t grow at a rapid pace, the clickthrough archive should be processed, and, most importantly, the search service shouldn’t see any outage.

The following location should be empty if the process worked as expected:

<drive>:\FASTSearch\components\resourcestore\generic\clickthrough
(data pushed by the SP extraction job)

Start an incremental crawl to ensure there is no impact on the results. Monitor the process for a week and ensure there are no errors in the sprel logs or crawl logs.

Ref Links:

https://blogs.msdn.microsoft.com/microsoft_search_bloggers/2015/02/10/sprel-consuming-more-disk-space-and-spreladmin-status-not-showing-as-ready/

https://social.technet.microsoft.com/Forums/exchange/en-US/a76fad93-35a4-4de9-8662-c4c7f8b24225/sprelexe-high-utilization?forum=fastsharepoint

http://blog.comperiosearch.com/blog/2011/07/12/learning-about-nctrl-disabling-fast-search-web-crawler/

https://technet.microsoft.com/en-us/library/ee943516(v=office.14).aspx

 

SharePoint 2010 Fast Disk Space Part 1 (Issue)

I recently faced a disk space issue on a SharePoint FAST Search 2010 crawl server. We needed to find out what was consuming the space; if it was the index or any core component, we would have no option other than increasing the space.

Investigate DiskSpace

Let’s begin the journey by identifying the files. We will use a simple tool, WinDirStat, to find the space-consuming files.

(Screenshot: WinDirStat results)

The initial results showed that the core components were not the issue; it was the logs that were consuming the space. I was surprised to see an SPRel log that was 27 GB and growing at a steady pace. The next big consumer was the query logs; we have a PowerShell script that cleans up query logs older than 60 days.

  1. FASTSEARCH\var\log\querylogs\  (can safely be cleaned up)
  2. FASTSEARCH\var\log\sprel\
  3. FASTSEARCH\data\sprel\worker

We realized something was going wrong with the SPRel process. To give you an idea of the SPRel service:

“SPRel is a clickthrough log analysis engine that improves search results relevancy by analyzing the entries that users click on in search result sets. In FAST Search Server 2010, we can use spreladmin.exe to configure SPRel, to schedule the clickthrough log analyses, and to retrieve status information.”

Investigate SPRel

Check whether the service is running as expected. The status command should be executed at a command prompt:

<drive>:\FASTSearch\bin\spreladmin showstatus

(Screenshot: SPRel status before the fix)

The process was actually stuck and not working as expected; it had stopped working in 2013 and was still stuck in 2016. We were heading in the right direction; now we had to analyze the log to find the reasons. Here came another problem: a 27 GB log is far too large to open in Notepad. A handy tool saved my life: Large Text File Viewer. It requires no installation, doesn’t hammer your CPU, and lazy-loads the file with a continuous query mode.

Logs

Two possible failures were found in the log: disk space and a server error.

Disk space wasn’t enough to process the analysis:

Impossible to schedule more targets (timeout 60s). 4 ready targets:

make_1032.urls_by_urlid.1_b1. Target was never started. Too large output for the workers. Input size : 153196411

make_1032.urls_by_urlid.3_b3. Target was never started. Too large output for the workers. Input size : 153952094

make_1032.urls_by_urlid.0_b0. Target was never started. Too large output for the workers. Input size : 153724399

make_1032.urls_by_urlid.2_b2. Target was never started. Too large output for the workers. Input size : 153472337

— stopping build.

The server didn’t respond and the analysis failed:

systemmsg Reset of analysis completed.
systemmsg Analysis failed 10 times, will retry from start.
systemmsg sprelrsclient failed. stdout: Connected to ResourceStore stderr: Could not list resources.. The remote server returned an error: (500) Internal Server Error.

It will take some time to figure out the solution; I will discuss it in the next post.

Using jQuery with custom Web services in SharePoint

I had a requirement to use jQuery to fetch data from a custom web service. This approach was adopted for a better UI experience, so the user doesn’t experience post-backs and screen flicker.

I faced a lot of issues making it work; with all the notes found by googling, I made it work like a charm.

The approach is as follows:

Step 1. Create a web service

Step 2. Client Side JavaScript definition to consume the web service

Step 1: Create web service

We will create a custom web service with a web method HelloWorld that takes a name as a parameter and returns a string in JSON format.

JSON (JavaScript Object Notation) works the same way as C# objects: use Object.Property to access a value.

Create a new web service project in ASP.NET; the web service definition is as follows:

namespace WebService1
{
    using System.ComponentModel;
    using System.Web.Services;
    using System.Web.Script.Services;
    using System.Web.Script.Serialization;

    /// <summary>
    /// Summary description for Service1
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [ScriptService]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [ToolboxItem(false)]
    public class Service1 : System.Web.Services.WebService
    {
        /// <summary>
        /// Hello world
        /// </summary>
        /// <param name="name">Name</param>
        /// <returns>Hello name</returns>
        [WebMethod]
        [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
        public string HelloWorld(string name)
        {
            var responseObject = new
            {
                MyName = "Hello " + name
            };
            return new JavaScriptSerializer().Serialize(responseObject);
        }
    }
}

Note:

  • [ScriptService] must be added to allow calls from JavaScript.
  • [ScriptMethod(ResponseFormat = ResponseFormat.Json)] must be added over the method definition to return JSON.

SharePoint context

Deploy the web service to the 12 hive folder so it is accessible to any user regardless of permission rights.

The following handler has to be added to the web.config in the layouts folder to avoid the 500 Internal Server Error that occurs when you call the service from jQuery.

<httpHandlers>
    <add verb="*" path="*.asmx" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory,
System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
</httpHandlers>

Step 2: Client Side JavaScript definition

jQuery AJAX calls are much simpler to use than native AJAX calls. The call is made by invoking $.ajax with the required parameters.

var Name = "Balaji";
var servicePath = "/_layouts/1033/myservice/Service1.asmx";
// the current site url + service url
var serviceURL = location.href.substring(0, location.href.lastIndexOf('/')) + servicePath;

$.ajax({
    type: "POST",
    contentType: "application/json; charset=utf-8",
    url: serviceURL + "/HelloWorld",
    data: '{"name":"' + Name + '"}',
    dataType: "json",
    success: processResult,
    error: failure,
    crossDomain: true
});

 

// Handle result
function processResult(xData) {
    var responseObject = jQuery.parseJSON(xData.d);
    alert(responseObject.MyName);
}

// Handle failure
function failure() {
    alert("Error occurred");
}

Note:

  • contentType and dataType must be defined as JSON.
  • crossDomain is set to true for cross-domain calls, i.e. when your app runs in one domain and the service in another.
  • The parameter passed to the web service is case sensitive and the format must be JSON.

The parameter "name" in the web method HelloWorld must match the name used in the jQuery call's "data" attribute:

data: '{"name":"' + Name + '"}'
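Hand-building the JSON string works until the value itself contains a quote or backslash. A safer sketch (reusing the Name variable from the example above) lets JSON.stringify do the escaping:

```javascript
// Build the request body with JSON.stringify so special characters in the
// value are escaped correctly, instead of concatenating the string by hand.
var Name = "Balaji";
var payload = JSON.stringify({ name: Name }); // '{"name":"Balaji"}'
// Pass `payload` as the `data` option of the $.ajax call.
```

The property name still has to match the web method's parameter name exactly, since it is case sensitive.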

Hope this kick-starts your own implementation.

CAML Basics

CAML (Collaborative Application Markup Language)

CAML queries are used internally to query SharePoint lists with specific conditions and formatting. CAML is as powerful as SQL and improves performance. A CAML query can be split into:

  • Display Part
  • Conditional Part

Display Part

They are generally used to display the content in a grouped and ordered manner.

Eg: OrderBy, GroupBy

<Query>
  <OrderBy>
    <FieldRef Name=’Name’/>
  </OrderBy>
  <GroupBy>
    <FieldRef Name=’Location’/>
  </GroupBy>
</Query>

Conditional Part

They are used to retrieve data matching certain conditions, using logical operators. The following example queries give an overview.

1. Simple Condition

The condition (Name == "Balaji") in CAML form would look like:

<Where>
  <Eq>
    <FieldRef Name=’Name’/>
    <Value Type=’Text’>Balaji</Value>
  </Eq>
</Where>

2. Multiple Conditions

Multiple conditions with logical operators have a tricky part: each logical operator can take exactly two children, so more than two conditions require another logical operator encapsulating them.

The condition ((Name == "Balaji") && (Location == "Chennai") && (ID == 1)) in CAML form would look like:

<Where>
  <And>
    <And>
      <Eq>
        <FieldRef Name=’Name’/>
        <Value Type=’Text’>Balaji</Value>
      </Eq>
      <Eq>
        <FieldRef Name=’Location’/>
        <Value Type=’Text’>Chennai</Value>
      </Eq>
    </And>
    <Eq>
      <FieldRef Name=’ID’/>
      <Value Type=’Number’>1</Value>
    </Eq>
  </And>
</Where>
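Since each And element takes exactly two children, a list of N conditions has to be folded pairwise. A small helper sketch that produces the nesting shown above; the function names eq and andAll are my own for illustration, not a SharePoint API, and no XML escaping is done:

```javascript
// Build one <Eq> condition from a field name, value type and value.
// Plain string concatenation; illustration only (no XML escaping).
function eq(field, type, value) {
  return "<Eq><FieldRef Name='" + field + "'/>" +
         "<Value Type='" + type + "'>" + value + "</Value></Eq>";
}

// Fold N conditions into nested <And> pairs: ((c1 && c2) && c3) ...
function andAll(conditions) {
  return conditions.reduce(function (acc, c) {
    return "<And>" + acc + c + "</And>";
  });
}

var where = "<Where>" + andAll([
  eq("Name", "Text", "Balaji"),
  eq("Location", "Text", "Chennai"),
  eq("ID", "Number", "1")
]) + "</Where>";
// Three conditions produce two <And> wrappers, matching the XML above.
```

The same fold works for Or; only the wrapping element name changes.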

Normal operator    SharePoint-specific
=                  Eq
>                  Gt
<                  Lt
>=                 Geq
<=                 Leq
Like               Contains
Null check         IsNull
Not null           NotNull
Begins with        BeginsWith
Date range         DateRangesOverlap
&&                 And
||                 Or

Generally, CAML queries can easily be generated using one of several tools. Refer to the following link for more info: http://msdn.microsoft.com/en-us/library/ff648040.aspx

SP Value types http://msdn.microsoft.com/en-us/library/aa558695(BTS.20).aspx

MEF with Silverlight 4

MEF stands for Managed Extensibility Framework. While building applications for SharePoint using Silverlight, I came across a scenario requiring dynamic loading of a XAP file and of the application itself. The main reasons for moving to this approach were scalability and reducing the delay in loading XAP files: the request is served by loading content dynamically. I will come up with a simple application demonstrating MEF; for now, here are the quick reference links for MEF.

10 Reasons to use MEF

Blogs

Part 1 , Part 2 , Part 3 , Part 4 , Part 5

Videos

Channel 9 Video series

Demo

Cloud Light, an application submitted for the Mix code challenge, just 10 KB in size (More..)

Dynamically show image in Silverlight

I had a requirement to show images in Silverlight dynamically. It is quite simple and can be done in two steps in a minute, but I spent nearly hours figuring out an issue.

Dynamically show images from code behind

var bitmapImage = new BitmapImage(new Uri("http://www.google.co.in/intl/en_com/images/logo_plain.png"));
image1.Source = bitmapImage;

The above code will make you smile only if you run the application from the Silverlight web application project, not from the default generated testpage.html.

The whole point is that this scenario works only if the application is hosted in, or run from, a local web application with a file server.

Note: Only JPG, PNG, and BMP work this way; GIF doesn’t. Check this post to make GIF work:

http://www.eggheadcafe.com/tutorials/aspnet/c0046ba1-5df5-486a-8145-6b76a40ea43d/silverlight-handling-cro.aspx