If you submit a sitemap of your website, Google can index your pages more thoroughly, which usually means more traffic. More traffic means more revenue.
Google is generally good at discovering and indexing pages on its own, but sometimes it misses some, and those missed pages cost you visibility. That is why you should submit a sitemap.
Sitemaps are most commonly created in XML format (Google also accepts RSS/Atom feeds and plain text files), and a single sitemap cannot contain more than 50,000 URLs. Each URL represents one unique page of your site. A sitemap file also must not be larger than 50 MB uncompressed, according to the sitemap protocol Google follows. If your site exceeds these limits, simply create more than one sitemap and submit them separately, or list them all in a sitemap index file.
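For illustration, a minimal XML sitemap might look like the following. The domain and dates are placeholders; only the `<loc>` element is required for each entry, the rest are optional hints:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of your site -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

You would save this as something like sitemap.xml at the root of your site and then submit its URL through Google's tools.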
There are plenty of websites and tools that can generate a sitemap for your site in no time. After you have submitted your sitemaps, Google will display information about which of your URLs are indexed and which are not. Don't be surprised to see a few non-indexed URLs in that report.
Crawlers are a small army of robots that Google sends to your site to index its pages. But what if you do not want some pages of your site to be indexed? These might be private pages, membership areas, or user profiles. Whatever the reason, you don't want the world to find them, and Google understands that, so it provides a solution.
You can control which pages of your site crawlers visit by creating a robots.txt file, where you define rules for Google's bots. You can block not only Google but every other well-behaved search engine on the Web. If the pages have already been indexed, you can use Google Search Console (formerly Webmaster Tools) to request that Google remove them.
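As a sketch, a robots.txt that keeps crawlers out of a members area could look like this (the /members/ and /private/ paths are made-up examples, not paths from any real site):

```txt
# "User-agent: *" applies the rules to all crawlers;
# use "User-agent: Googlebot" to target Google specifically.
User-agent: *
Disallow: /members/
Disallow: /private/

# Optional: point crawlers to your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of your domain (e.g. example.com/robots.txt) for crawlers to find it. Note that robots.txt only asks crawlers not to visit those pages; it is a convention that reputable search engines honor, not an access control mechanism.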