
Sunday, December 28, 2014

Adding Robots Files to your Web Site - Web Spiders


In this article we see, step by step, how to add a Robots file to your web site. Web spiders are usually called "Robots", and when they access a web page they identify themselves as "User-agents".
A Robots file is a suggestion or request made by you to the web spider not to index certain files or folders in your web site.
It is important to realize that the directives inside the resulting Robots.txt file are purely advisory for the web crawlers.
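To illustrate how these directives are consumed, a well-behaved crawler reads Robots.txt before fetching pages. Here is a minimal sketch in Python using the standard library's urllib.robotparser module (the site URL and paths are hypothetical):

from urllib import robotparser

# Load the site's Robots.txt (hypothetical URL)
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# A polite crawler asks before fetching; the rules remain purely advisory
print(rp.can_fetch("*", "http://www.example.com/PrivateDirectory/page.aspx"))  # False if disallowed
print(rp.can_fetch("SomeRobot", "http://www.example.com/Default.aspx"))        # True if allowed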



Adding Robots Files to your Web Site


Before we see, step by step, how to make a Robots file, take a look at the basics of the Robots syntax with these examples:

All robots can access all files:
User-agent: *
Allow: /


All robots cannot access some files:
User-agent: *
Disallow: /file1.aspx
Disallow: /file2.html


All robots cannot access some directories:
User-agent: *
Disallow: /PrivateDirectory/

Some robots cannot access some directories:
User-agent: SomeRobot
Disallow: /PrivateDirectory/
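
These directives can be combined in a single file, and many crawlers also recognize a Sitemap directive pointing to your sitemap (we will create one with IIS below). A minimal combined sketch, with hypothetical paths and URL:

User-agent: *
Disallow: /file1.aspx
Disallow: /PrivateDirectory/

User-agent: SomeRobot
Disallow: /

Sitemap: http://www.example.com/sitemap.xml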

After this short introduction, let's add a Robots file.

Adding Robots Files to your Web Site - Web Spiders


Open the IIS Manager (press the Windows key + R and type "inetmgr"), then open your web site and click on the Search Engine Optimization (SEO) icon:

Adding Robots Files to your Web Site - Web Spiders 1

First you have to add a Sitemap, if you don't have one:
Adding Robots Files to your Web Site - Web Spiders 2

After you have created a new Sitemap, click on "Add a new disallow rule":
Adding Robots Files to your Web Site - Web Spiders 3

Here, select the URLs of the files and directories that you don't want to be indexed:

Adding Robots Files to your Web Site - Web Spiders 4

For example:

Adding Robots Files to your Web Site - Web Spiders 5

Now let's add the request to index our web site:


Select the whole web site to be indexed:

Adding Robots Files to your Web Site - Web Spiders 6


Now let's see our rules:

Adding Robots Files to your Web Site - Web Spiders 7

As you can see, ALL User-agents are ALLOWED to index our site, except for the paths covered by the Disallow rules:

Adding Robots Files to your Web Site - Web Spiders 8

Now let's see the Robots.txt file that was just created automatically:

It looks like this:
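
As a rough sketch, based on the Allow rule for the whole site and the Disallow rules chosen above (the actual paths depend on your selections), the generated file would contain something like:

User-agent: *
Allow: /
Disallow: /file1.aspx
Disallow: /PrivateDirectory/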




That's all!!!! In the same way, you could select any files or folders to exclude from the indexers.

You can learn about the free IIS SEO Toolkit, and how to install it and analyze and optimize your web site, in this tutorial.


By Carmel Shvartzman



Edited by Carmel Schvartzman

Sunday, November 9, 2014

How to Notify the Search Engines about new Content on your blog

In this article we describe, step by step, how to notify the search engines about new content on your blog by using Ping-o-matic, a FREE online tool.
Here we learn how to inform the different search engines, such as Google, Yahoo, Weblogs.com, Topics Exchange, Collecta, SkyGrid, or NewsGator, that your web site has been updated and now contains expanded information to index.
Ping-o-matic is a FREE online project of the WordPress Foundation, which helps you avoid the situation in which your new content is ignored by the engines for some time.
To address that problem, Ping-o-matic pings more than 20 search engines, asking them to re-index your web site because new content has been appended.
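
Behind the scenes these notifications use the standard weblogUpdates XML-RPC ping. If you prefer to send the ping programmatically rather than through the web form, a minimal sketch in Python might look like this (the blog name and URL are hypothetical, and the endpoint shown is the one commonly given for Ping-o-matic's XML-RPC service):

import xmlrpc.client

# Ping-o-matic's XML-RPC endpoint; blog name and post URL are placeholders
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
response = server.weblogUpdates.ping("My Blog Name", "http://myblog.example.com/new-post")
print(response)   # typically a dict with an error flag and a status message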

Just by clicking a button, all relevant Search Engines are notified of your Web Site updates:

Update and Notify the Search Engines about new Content on your blog



Notify the Search Engines about new Content on your blog


Browse to Pingomatic.com and find the following graphical user interface:









Type a name which represents your Web Site, and enter the URL of the Web Page where you added new content:



You can optionally add an RSS feed URL if you wish, but that's not required.

Then you can select from the list the search engines to notify, checking all of the "Services to Ping" that you want.

If you are not sure which web services are relevant for you, just click on "Check All":




Notify the Search Engines about new Content on your blog


Finally, click on the "Send Pings" button:




Then you can check whether all Search Engines have been notified of the changes:




The next time that you browse to the Ping-o-matic web site, the fields will be automatically filled for you. Just change the URL field to that of the updated post, and that's all.


In this article we reviewed, step by step, how to notify the search engines about new content on your blog by using Ping-o-matic.
Happy programming.....

      by Carmel Schvartzman


Written by Carmel Schvartzman

Sunday, August 10, 2014

IIS SEO Toolkit - How to install, analyze and optimize your web site

In this article we'll see how to install and use the Microsoft IIS SEO (Search Engine Optimization) Toolkit to improve the ranking of your web site with search engine crawlers. This is a completely free tool that helps you rank your website better with the search engines, giving you a detailed analysis of the warnings and errors found and the number of times they appear in the web site, as well as performance issues, duplicate files, 404 error codes, broken links by page, slow pages arranged by directory and content type, redirects, and link depth.

We'll analyze an application as follows:



Browse to www.microsoft.com/web/seo and install the IIS SEO Toolkit:




Follow the installation instructions and, after finishing, open the tool found under IIS 7.0 Extensions in All Programs:



You can also find the Toolkit in the IIS Manager window:



Open the "Site Analysis" feature to start a New Analysis: just type the name of the website and give a "Name" to the Report :



You'll be presented with a Dashboard:



The following is the "Summary" of the report, summarizing the Violations:



See the pages with the most violations:



This is the count of each kind of violation:






The "Details" window can give you the load time of the page in milliseconds:

It is very important that you find and resolve the broken links in your website, because websites with broken links are heavily penalized by the search engines:
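
The toolkit finds these for you, but as an illustration of the kind of check it automates, here is a minimal sketch in Python that fetches one page and reports linked URLs that do not answer with HTTP 200 (the page URL is hypothetical, and a real crawl would cover the whole site):

import urllib.request, urllib.error
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    # Collects the href attribute of every <a> tag on the page
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for (name, value) in attrs if name == "href" and value]

def report_broken_links(page_url):
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        absolute = urljoin(page_url, link)
        if not absolute.startswith("http"):
            continue  # skip mailto:, javascript: and in-page anchors
        try:
            status = urllib.request.urlopen(absolute, timeout=10).getcode()
        except urllib.error.HTTPError as error:
            status = error.code          # e.g. 404 for a broken link
        except urllib.error.URLError:
            status = None                # unreachable host
        if status != 200:
            print("Broken or unreachable:", absolute, "-> status", status)

report_broken_links("http://localhost/")   # hypothetical URL of the site under analysis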



Take a look at the slow pages and try to reduce the load on them:



At the web page level, you are given a thorough analysis of all problems with the HTML:



The SEO violations are specified in depth and, more importantly, you are given the ways to resolve the issues in the "Recommended Actions" panel:



In case you want to see specifically where in the HTML the violation is located, open the Content tab:


Some important points regarding the IIS SEO Toolkit are the following:

1) The summary window provides a detailed analysis of the warnings and errors found and the number of times they appear in the web site.

2) View the web page in a browser: right-click a URL and select "View in Browser".

3) View web pages linking to this web page: right-click and select "View Pages Linking to This Page" to show all web pages on your site that link to the selected item and would be affected if you change its URL.

4) View web pages linked by the current page: this shows all the resources that your HTML markup references.

5) The Violations section provides information about which pages have the most errors, arranged by error level and categorized by content, SEO, web markup, and so on.

6) Web page Word Analysis: in a selected web page's "Details" window, open the "Word Analysis" tab to get the most commonly used words in the current web page, which is useful for describing the page to the search engines (see the sketch after this list).

7) The IIS Toolkit analysis also includes performance, content, and links sections, each containing its own full data. Here are some important points among those sections:
duplicate files, with descriptions and keywords; performance issues; most linked pages; 404 errors; broken links by page; slow pages arranged by directory and content type; redirects and link depth.
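
As a rough illustration of what the Word Analysis tab computes (not the toolkit's actual implementation), a minimal sketch in Python could count the most frequent words in a page's visible text; the URL is hypothetical:

import re
import urllib.request
from collections import Counter
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    # Accumulates the visible text of the page, skipping script and style blocks
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

html = urllib.request.urlopen("http://localhost/", timeout=10).read().decode("utf-8", "ignore")
extractor = TextExtractor()
extractor.feed(html)
words = re.findall(r"[a-z]{3,}", " ".join(extractor.chunks).lower())
print(Counter(words).most_common(10))   # the ten most frequent words on the page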

It's also possible to export this data into an Excel file.

That's all!!  
Happy programming.....
        By Carmel Schvartzman
Written by Carmel Schvartzman