

Showing posts from June, 2011

SharePoint Ports, Proxies and Protocols ....

Why is this important to understand? The first question is "why do we care about farm communications?" There are three reasons why every SharePoint architect and consultant should know about this stuff:

1. Secured or 'locked down' farms may have servers on different network segments, and you may have to configure firewalls to let only the minimum traffic through. Without knowing what traffic needs to go where, this is a very difficult process.

2. Understanding network activity is very useful when troubleshooting strange problems, since SharePoint does not do a great job of reporting when network issues prevent something from working.

3. Windows Server 2008 and SQL Server 2008 are both 'locked down' by default, meaning that a fresh install of Windows Server 2008 will have everything disabled in Windows Firewall. Clearly you could simply enable all inbound/outbound communications (and I've done that once or twice myself in th…
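As a rough illustration of the alternative to opening everything up (a sketch only: the port numbers assume a default SharePoint 2010 farm, the rule names are hypothetical, and your topology may differ), the minimum inbound rules could be added on a Windows Server 2008 box with netsh advfirewall, run here from a PowerShell prompt:

    # End-user web traffic to the front-end servers
    netsh advfirewall firewall add rule name="SharePoint HTTP" dir=in action=allow protocol=TCP localport=80
    netsh advfirewall firewall add rule name="SharePoint HTTPS" dir=in action=allow protocol=TCP localport=443

    # Inter-server service application calls (SharePoint 2010 default back-end ports)
    netsh advfirewall firewall add rule name="SharePoint Service Apps" dir=in action=allow protocol=TCP localport=32843-32844

    # Farm servers talking to a default SQL Server instance
    netsh advfirewall firewall add rule name="SQL Server" dir=in action=allow protocol=TCP localport=1433

Each rule opens exactly one well-known port, which is the spirit of a locked-down farm: allow only what the topology actually needs and nothing more.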

Differences between Crawler impact rules and crawl rules in SharePoint 2010 search

A crawler impact rule defines the rate at which the Windows SharePoint Services Help Search service requests documents from a Web site during crawling. The rate can be defined either as the number of documents requested simultaneously or as the delay between requests. In the absence of a crawler impact rule, the number of simultaneous requests ranges from 5 through 16, depending on the hardware resources. You can use crawler impact rules to modify the load placed on sites when you crawl them.

Crawl rules, on the other hand, let you set the behavior of the Enterprise Search index engine for content under a particular path. By using these rules, you can prevent content within a particular path from being crawled. For example, if a content source points to a URL path such as http://www.test.com/ but you want to prevent content from the "downloads" subdirectory http://www.test.com/downloads/ from being crawled, you would set up a rule for the URL…
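To make the distinction concrete, here is a minimal PowerShell sketch of the exclusion scenario above (it assumes a SharePoint 2010 farm with a single default Search service application; the URL is the illustrative one from the example):

    # Load the SharePoint snap-in if it isn't already loaded
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # Grab the farm's Search service application
    $ssa = Get-SPEnterpriseSearchServiceApplication

    # Exclude the "downloads" subdirectory while the rest of
    # http://www.test.com/ stays crawlable
    New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa `
        -Path "http://www.test.com/downloads/*" `
        -Type ExclusionRule

Crawler impact rules, by contrast, are configured per site name in Central Administration's Search Administration pages and govern the request rate, not which URLs get crawled at all.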