Status
Not open for further replies.

Captain Crunch

New Member
Does anyone know whether submitting a sitemap to Google will increase the PageRank of my site's pages other than the homepage?

In other words, will submitting a sitemap alone have any effect on my PR?

Thanks
 

louie

New Member
No, it won't, but it will help Google spider your website better.
 

link8r

New Member
IMHO: it won't increase your PageRank or the number of pages indexed, but it will let you see what Google is indexing and what it's not. That may help you get more pages indexed, but pages are indexed based on Google's perception of their importance. So if you have a page that's 16 links deep, or an orphan page, Google may not index it. With a sitemap, you provide a definite map against which Google can confirm whether it has indexed each page. It may help Google discover the orphaned page, but it's unlikely to make it index it.

Hope that helps!
 

Satanta

New Member
No, it won't, but it will help Google spider your website better.
It doesn't have a (direct) impact on the PR.

As louie says, it can help the SE index your site. If this results in them indexing additional pages, the internal linking can then provide a slight PR boost to other pages. However, if a page wasn't already indexed, chances are it has very few (if any) links worth talking about, so the difference in PR will be negligible.

If you've gone to the effort of creating unique content for a page, you do want that to be available to searchers. Submitting a sitemap to assist in the indexing of your site is certainly a good thing, even if it doesn't make an impact on the PR side of things.
 

cools

New Member
Both XML and HTML sitemaps are important for a website. An HTML sitemap is used by the visitors browsing the site, and an XML sitemap is used by the crawlers crawling it, so both are important for website popularity.
 

link8r

New Member
Both XML and HTML sitemaps are important for a website. An HTML sitemap is used by the visitors browsing the site, and an XML sitemap is used by the crawlers crawling it, so both are important for website popularity.

Ummmmm, popularity? How is an XML sitemap important for popularity?
 

Captain Crunch

New Member
Can anyone suggest any free software for creating an XML sitemap? Most that I have found only map the first 500 pages they find, and I have more pages than that. Sorry, but you are all dealing with a novice here!

Thanks for all the help
 

I4Visual

New Member
A sitemap is actually a very good thing to have for any site; it's good for SEO. As for your question: no, it won't directly improve your PR, but it may contribute to it, together with other well-executed SEO tactics.
 

link8r

New Member
It doesn't actually help Google find your pages; if Google can't find a page on its own, it won't be interested in it. You'd be surprised just how much Google can get from a webserver.

The sitemap is important so that YOU know how many pages Google has found and how many it's indexing.
 

TOPriceIE

New Member
I hope that this thread isn't too old.

Just on a different side of this question: if you have a site with over 1 million pages, how do you generate a sitemap? It takes about a week to generate, and in the meantime the number of pages and other data will change a few times, so the map will never be accurate.

Also, if you type in site:xxxx.xx, does it show the number of pages currently indexed by Google? And why is there a different number of results in different regions, i.e. google.ie vs google.com, and with the www prefix vs without?
 

Claudiu

New Member
Let's take them one step at a time.

First of all, a sitemap can't have more than 50k URLs. But you can create a sitemap index that contains links to other sitemaps. See sitemaps.org for details.
Regarding the time it takes to create the sitemaps and data changing:
- A sitemap just contains the URLs. If you're aiming that high (1 million pages), you should already have a system in place that generates sitemaps dynamically. Data should never change; URLs shouldn't change. It doesn't matter what content the pages have, as sitemaps only contain the URL of the page, not the content (submit sitemaps to Google please, not feeds). Also, in a sitemap you can specify how often the content of a page changes, so Google knows how often it should crawl it.
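To make the 50k limit concrete, here is a minimal sketch of what a sitemap index file looks like; the file names and domain are placeholders, not your actual site's:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: links to child sitemaps, each holding at most 50,000 URLs -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.ie/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.ie/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap is then an ordinary urlset, where the optional changefreq element is the hint about how often the page changes:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.ie/products/1234/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```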

Regarding the different number of indexed pages:
- The reason Google has multiple domains, google.ie, google.com and so on, is so it can target results by location. The counts differ because, for example, the content of an Irish-only tourism site is more relevant to google.ie than to google.com if you're searching for hotels.
- You should never have both www.example.com and example.com live. www is actually a subdomain that has come to be used as the canonical website URL. Either it should redirect to the bare domain (example.com), or the bare domain should redirect to the subdomain (www.example.com). Having both up results in duplicate content, so Google will most likely penalize both websites (yes, to us it's the same site, but to Google those are two separate websites). So put the redirection in place and you'll get rid of a lot more problems.
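One common way to put that redirection in place, sketched here under the assumption of an Apache server with mod_rewrite enabled (the domain is a placeholder):

```apache
# .htaccess sketch: permanently redirect www.example.ie/* to example.ie/*
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.ie$ [NC]
RewriteRule ^(.*)$ http://example.ie/$1 [R=301,L]
```

Swap the condition and target around if you'd rather standardise on the www version instead.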

Also, keep in mind that Google will not index all of your pages even if you do submit a sitemap for all of them. The pages have to be relevant and have backlinks.

Hope this helps!

Regards,
Claudiu
 

TOPriceIE

New Member
Hi Claudiu,

Thank you for your reply.

I guess I was talking about a number of URLs containing products. I'm not sure what the actual number of URLs on the site is (without the products). How do I generate such a sitemap? I have used GSiteCrawler, but as I mentioned, it gave me all the links (a feed, as you call it).

Going back to the two domains with and without www: how do I change it? In Webmaster Tools I have the preferred domain set to www.example.ie; is there anything else I should do on this matter?
 

mneylon

Administrator
Staff member
I think I'm going to split this thread, as it's more to do with optimising a large site than dealing with Google sitemaps ..
 

Claudiu

New Member
@TOPriceIE

The website that you're talking about uses PHP and stores products in a database. By analyzing the way URLs are created on the website, you can recreate them. For example, for my blog, I know that I have a page for each article and that the structure is domain-name.info/article_id/article_title/ . So I just generate a sitemap with every article in the database. To that, I add a sitemap with all the categories (which I take from the database as well).

Same goes for your website. You should have a PHP class that generates sitemaps based on the data in the database and the way the URLs are constructed. I strongly discourage crawling your own dynamic website to create a sitemap. If you have a simple HTML+CSS website that has gathered hundreds of pages, then yes, go ahead and crawl it to create a sitemap; otherwise, create it dynamically.
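Claudiu describes doing this with a PHP class; as a rough illustration of the same idea (build the sitemap from database rows rather than crawling), here is a minimal Python sketch. The domain, the article_id/slug URL pattern, and the row shape are assumptions for illustration, not the actual site's:

```python
# Generate sitemap XML straight from database rows instead of crawling.
from xml.sax.saxutils import escape

def build_sitemap(rows, base="http://www.example.ie"):
    """rows: iterable of (item_id, slug) tuples, e.g. fetched from the DB."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for item_id, slug in rows:
        parts.append("  <url>")
        parts.append("    <loc>%s/%d/%s/</loc>" % (base, item_id, escape(slug)))
        parts.append("    <changefreq>weekly</changefreq>")
        parts.append("  </url>")
    parts.append("</urlset>")
    return "\n".join(parts)

def chunk(rows, size=50000):
    """Split rows into batches, one per sitemap file (the 50k URL limit)."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch
```

With a million products you would feed the rows through chunk(), write one file per batch, and list those files in a sitemap index.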

A feed is different from a sitemap (I'm referring to the RSS feeds you see on most websites). A feed provides more info about the content and less about how the content changes. More exactly:
- A feed contains the title, description of the article, URL, date and so on, but it doesn't say how often the content of the page changes.
- A sitemap contains the URLs and data that you want to pass to Google, such as how often you want a page crawled for new info.

The developers of the website should know what the sitemap structure is and how to generate it dynamically.
 

pfurey101

New Member
Give G, Y and B what they want!

They look for an XML sitemap in their webmaster tools, so why not? ('Tis great for SEO anyway.) And the HTML sitemap is great for your visitors.
 