Study material from MSDN, TechNet and others

Plan for availability for SharePoint sites

http://technet.microsoft.com/en-us/library/cc748824%28v=office.14%29.aspx

Customize / Code behind for list forms

http://thesharepointdive.wordpress.com/2012/03/20/list-forms-deployment-for-sharepoint-2010-part-1-of-4/ 

Repeated authentication prompts from SharePoint

Some users complain about repeated prompts for user name and password whenever they access the site, which is very irritating. For some users, entering the credentials a few times worked; others would be prompted ten times before the site appeared, and sometimes the result was just a blank, empty page. While typing their credentials so many times, some users accidentally entered the wrong password and ended up locking their accounts. When I faced the issue, I tried switching to a different browser, and voilà, it worked.


If you search for this issue, you will find plenty of references to authentication providers, registry issues, trusted sites or intranet zone settings, or the suggestion that the users are browsing on the server itself and the loopback check was not disabled.
But why did it sometimes work and sometimes not?
I found that, to some extent, Content Advisor, Websense, firewall or proxy settings could each also be the problem.


It turned out the users were behind a proxy server, with proxy settings deployed to the desktops via group policy. The group policy forces the browser to ask for authentication for content which, according to the policy, may be inappropriate. The proxy would behave differently every time it was asked for a new web site address, and would cause an authentication prompt for each image on the page.

You can confirm this yourself by temporarily switching to a browser other than your default one. If the new browser does not pick up the policies or Content Advisor settings that your default browser does, and the site works, you have found your answer. You can then ask the network admins to add the web site to the proxy exception list, fix the proxy server itself, switch users over to a different browser, or revise the Content Advisor settings.
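If you want a quick check outside the browser, a small script can compare a direct request with one routed through the proxy; a 407 (Proxy Authentication Required) on the proxied request points at the proxy as the culprit. This is only a rough sketch, and the site URL and proxy address are placeholders you would replace with your own:

# Rough diagnostic sketch: compare a direct request with a proxied one.
# The site URL and proxy address are placeholders - substitute your own.
import urllib.request, urllib.error

site = "http://sharepoint.sitename.com/"     # hypothetical site URL
proxy = "http://proxy.sitename.com:8080"     # hypothetical proxy address

def status(url, proxies):
    opener = urllib.request.build_opener(urllib.request.ProxyHandler(proxies))
    try:
        return opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        return e.code

# A 401 on the direct request is normal for a site using Windows authentication;
# a 407 on the proxied request means the proxy itself is demanding credentials.
print("Direct :", status(site, {}))
print("Proxied:", status(site, {"http": proxy, "https": proxy}))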

Note: This article does not cover the repeated authentication prompts that occur on the web server itself due to the loopback check or Alternate Access Mappings. For those, refer to the Guide to Alternate Access Mappings in SharePoint below.

Guide to Alternate Access Mappings in SharePoint

Below are the steps you can use to configure Alternate Access Mappings for your SharePoint site.

1 Add the URL to DNS with a pointer to the server's IP address, or add an entry to the HOSTS file.
2 Add a Public URL in Alternate Access Mappings.
3 Add a binding for the web site in Internet Information Services (SharePoint does not do this for you).
4 Add the site to the Local intranet zone in IE to avoid a logon prompt.
5 Verify access (see the sketch after this list).
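For steps 1 and 5, a small script like the one below can confirm that the new host name resolves and that the site answers over HTTP. It is only a sketch; the host name, IP address and URL are placeholders:

# Sketch for steps 1 and 5: check name resolution and basic HTTP access.
# "portal.sitename.com" and the IP address are placeholders - use your own values.
#
# Step 1, HOSTS file alternative - one line in C:\Windows\System32\drivers\etc\hosts:
#   192.168.1.10    portal.sitename.com
import socket
import urllib.request, urllib.error

host = "portal.sitename.com"                 # hypothetical public URL host
print("Resolves to:", socket.gethostbyname(host))

try:
    response = urllib.request.urlopen("http://" + host + "/", timeout=10)
    print("HTTP status:", response.getcode())
except urllib.error.HTTPError as e:
    # 401 is expected for a site using Windows authentication; it still proves
    # that the DNS entry, IIS binding and Alternate Access Mapping are in place.
    print("HTTP status:", e.code)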
Note: If you cannot verify access on the web server itself, check whether the loopback check is the cause.
A workaround for the above issue is available in the Microsoft article
http://support.microsoft.com/kb/926642/en-us
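That article describes two workarounds; the quicker one sets the DisableLoopbackCheck registry value on the web server. A hedged sketch of applying it from Python (run elevated, and note that the article's host-name based method is the one recommended for production) could look like this:

# Hedged sketch of the "disable loopback check" workaround from KB 926642.
# Run as Administrator on the web server; a reboot (or at least an IISRESET)
# is normally needed before the change takes effect.
import winreg

key_path = r"SYSTEM\CurrentControlSet\Control\Lsa"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "DisableLoopbackCheck", 0, winreg.REG_DWORD, 1)
print("DisableLoopbackCheck set to 1")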

For a complete article reference, see the thread below:
http://social.technet.microsoft.com/Forums/zh/sharepoint2010setup/thread/7ab4f87c-2246-4531-91bf-e0ec9f07a10a

How to set up a robots.txt file for your site

What is a Robots.txt

Robots.txt is a text (not HTML) file placed in the root of your site to tell search robots/spiders which pages should and should not be visited/indexed/crawled. It is not mandatory for search engines to adhere to the instructions found in robots.txt, but generally search engines obey what they are asked not to do.

It is important to note that a robots.txt file does not completely prevent search engines from crawling your site (i.e. it is not a firewall); having a robots.txt file on your site is something like putting a note saying "Please, do not enter" on your unlocked front door. Put simply, it will not prevent thieves from coming in, but the good guys will not open the door and enter.

It therefore goes without saying that if you have sensitive data, you cannot rely 100% on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it. They do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://www.sitename.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, don't be surprised that search engines index your whole site.
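Python's standard library ships a parser that follows exactly this convention, which makes the behaviour easy to see for yourself. The domain below is just a placeholder:

# Demonstrates how a well-behaved crawler consults robots.txt at the site root.
# "www.sitename.com" is a placeholder domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.sitename.com/robots.txt")   # always the root
rp.read()                                                     # fetch and parse the rules
print(rp.can_fetch("*", "http://www.sitename.com/somepage.aspx"))
# Prints False if the path is disallowed for all user agents; if the file is
# missing, everything is treated as allowed - the "index the whole site" case.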

Purpose of a Robots.txt


The main reason to use a robots.txt file is to keep sensitive information out of search engine results.

Steps to set up a robots.txt file



Launch Notepad
Put the following in your robots.txt file:

User-agent: *
Disallow: /

Save the file as: robots.txt
Adding a robots.txt file to the root of your public anonymous site.
You can add it in the root directory of your Visual Studio Website Project.
You can place it directly in the virtual directory at the root level of your website folder.
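The sample file above blocks crawling of the entire site. If you only want to keep particular areas out of search results, list those paths instead; the folder names here are just placeholders:

User-agent: *
Disallow: /private/
Disallow: /reports/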


Adding a robots.txt file to the root of your public anonymous SharePoint site.
Open up your root site in SharePoint Designer.
Double-click the All Files folder.
Drag and drop the newly created robots.txt to the All Files folder.
Exit SharePoint Designer.
Alternatively you can create the robots.txt from within SharePoint Designer itself.

To ensure the file is accessible to search engines, go to your site URL and append "/robots.txt". Example: http://www.sitename.com/robots.txt
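You can also script this check; a minimal sketch (the URL is a placeholder) looks like this:

# Minimal check that robots.txt is served from the site root.
import urllib.request

url = "http://www.sitename.com/robots.txt"    # placeholder URL
with urllib.request.urlopen(url, timeout=10) as response:
    print(response.getcode())                 # expect 200
    print(response.read().decode("utf-8"))    # should print your rules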

Additional reading is available at
http://www.robotstxt.org/robotstxt.html

Note:
What if you forgot or skipped adding a robots.txt file before you deployed your pages, and search engines have already crawled the sensitive content?
In that case you have to use the Webmaster tools of each search engine and explicitly request that each page be removed from the search results.