How To Index a Website in Google Search Within 24 Hours

Do you want more organic search traffic to your site? I’m willing to bet the answer is yes – we all do! A website is now an essential part of any online presence.

Many businesses are now creating their own blogs to get their sites indexed faster and to catch the attention of customers within their niche. I have several websites, and I have gotten all of them indexed within 24 hours. Indexing is not rocket science, but it isn’t easy either if you don’t know exactly what to do.

How do you get your new site or blog indexed by Google, Bing, and the other search engines? Well, you’ve got two choices: wait for them to find you, or take action and get indexed faster. I don’t know about you, but I’d rather get my sites indexed as quickly as possible, because that gives me more time to build my audience.

If you have recently bought your domain, you’ll want to first check whether Google has already found and indexed it.

The easiest way to check this is to use a site:domain.com search in Google. If Google knows your site exists and has crawled it, you’ll see a list of results similar to the one for SmartActiveBlogger.com in the screenshot below:

[Screenshot: site: search results for an indexed site]

If Google hasn’t yet found your site, you’ll get no results at all, similar to this:

[Screenshot: site: search returning no results]

What is Indexing a Website?

Indexing is done by search engines like Google and Bing using a spider or bot. In simple terms, indexing a website means making your site eligible to show up in search results.

Indexing is controlled by meta tags and the robots.txt file, which tell the bots whether to crawl a site and whether to index or noindex its pages.
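For example, a single robots meta tag in a page’s <head> is enough to keep that page out of the index (a minimal illustration using the standard tag):

    <!-- Tell all bots not to index this page, but still follow its links -->
    <meta name="robots" content="noindex, follow">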

Indexing is the process of adding the information gathered by Googlebot from your pages to Google’s search index. The bot processes all the words and tags on a page and then filters it by quality.

Guide to Indexing a Website in Google Within a Few Hours and Get Your New Website or Blog Discovered

So how can you get your new website indexed and discovered by Googlebot? Typically, you have to wait for Googlebot to crawl your website and add it to the Google index.

Here are the most effective ways to get your website indexed within hours – I was able to index my own website within 5 hours. These strategies work for every type of website, as most of the techniques in this guide rely on external resources.

So how can you speed that up? Here are some great ways.

[Note: The best thing about some of these techniques is that you will also get referral traffic.]

  1. Create at least 10 pieces of quality content
  2. Interlink your content
  3. Optimize web pages for long-tail keywords
  4. Submit your content to social sites
  5. Submit your blog to web directories
  6. Create an XML Sitemap
  7. Submit your sitemap to Google Search Console
  8. Submit your website URL to search engines
  9. Create Social Profiles and Pages
  10. Start your RSS feed with FeedBurner
  11. Commenting on Blogs
  12. Guest Posting
  13. Ping Services
  14. Add a Blog
  15. Use Robots.txt
  16. SEO Tags

1. Create at least 10 pieces of quality content

Before launching your website and making it accessible to the public, you should write at least 10 quality posts that have been through proofreading, editing, and SEO optimization. By doing this you will have quality content at your disposal, and you can focus on other important aspects rather than rushing around at the last minute.

2. Interlink your content

This is the biggest mistake that far too many website owners make: they completely ignore interlinking their articles and think that a related-posts list at the end of each article will do all the rest.


Google treats anchor text as a ranking factor, and internal links let Googlebot reach your deep links and articles. Googlebot works on a limited crawl budget, so helping it scan more of your content in one visit works in your favor.

  • Make it easy for spiders and bots by linking thoroughly between your posts and pages.
  • Make sure your interlinking is logical and uses rich anchor text (see the example below).
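Here is a minimal sketch of the difference (the URL is a placeholder):

    <!-- Rich anchor text: tells bots and readers what the target page is about -->
    <a href="/wordpress-sitemap-guide/">how to create a WordPress sitemap</a>

    <!-- Generic anchor text: carries no meaning -->
    <a href="/wordpress-sitemap-guide/">click here</a>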

3. Optimize web pages for long-tail keywords

Don’t focus on head keywords: competition on those is high, and you can easily be outranked by the big players. Use tools like Google AdWords Keyword Planner, KeywordTool.io, LongTailPro, SEMrush, and others to find long-tail alternatives.

Don’t get caught up in publishing several posts in a single day; Google favors quality over quantity. Once you have found a few long-tail keywords that fit your website’s content, start optimizing your pages by strategically placing those keywords throughout your content.

4. Submit your content to social sites

Social sites get a lot of attention from search engines, so this is a very effective way to get your site noticed by Googlebot and get your website crawled and indexed within hours. LinkedIn and Quora are two examples of social sites where you can submit your content.

Sites like these will also bring you a good amount of traffic, as they are full of people looking for content.

5. Submit your blog to web directories

Web directories were created for two purposes: first, to let people know what blogs are available in the niches they are interested in, and second, to let search bots know about your site so they can crawl it.

There are several web directories around the web, and not all of them are good. You should research the good web directories available in your niche before submitting your website.

Note: Don’t use link-exchange web directories.

6. Create an XML Sitemap

I hope you already know about sitemaps and have one. If you don’t, our guide How To Create WordPress Sitemap Using Google XML Sitemap Plugin will answer all your needs.

A sitemap exists so that search engine bots and crawlers can effectively crawl and index your website. There are several WordPress plugins for creating sitemaps, such as Yoast SEO and All In One SEO. If you have a static website, look for a standalone sitemap-generator tool.
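For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks something like this (the URL and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/my-first-post/</loc>
        <lastmod>2018-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>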

7. Submit your sitemap to Google Search Console


The next thing to do after you have created your sitemap is to submit it to Google, Bing, Yahoo, Baidu, Yandex, and the other search engines.

Search engines will not find all of your content if you don’t submit your sitemap. You need to create a webmaster tools account with each search engine and submit your sitemap there.

You can submit your sitemap to Google in Search Console under Crawl > Sitemaps. We have a complete guide on How To Submit Your Blog Sitemap To Google Search Console.

8. Submit your website URL to search engines

There are several ways to bring a search engine’s crawler to your website, yet many people ignore this step and focus on other factors. Submitting your website URL to a search engine only takes a moment, and it doesn’t hurt you in any way.

To submit your blog to Google, sign in to your Google account and click the Submit URL option in Search Console.

9. Create Social Profiles and Pages

As mentioned previously, social networks get tons of visits from search bots, and crawlers get to your site through the links shared on them. So you should have your own accounts and pages on social networks, each with your link on it.

These include Facebook, Google+, Twitter, LinkedIn, Instagram, and others. There is a little trick to creating your accounts on social networks: use your brand name as your social account username or URL.

For example, “SmartActiveBlogger” has Facebook, YouTube, and Google+ accounts with the same name.

10. Start your RSS feed with FeedBurner

Sign up for FeedBurner, Google’s own RSS management tool, and submit your blog’s feed URL using its “Burn a feed” option.

In addition, FeedBurner will notify Google of your new blog updates and content. It will also notify other services that rely on feeds, like news aggregators and content-sharing sites.

Don’t worry – they don’t copy your content; they link back to you.

11. Commenting on Blogs

Commenting on blogs that are relevant to your site will boost your visibility to search bots. Commenting on dofollow blogs will bring search bots to visit your website, with the added bonus of getting you a backlink.

Most blogs apply the nofollow attribute to external links, but commenting on them can still benefit you and causes no harm.
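For context, a nofollow comment link looks like this in the page’s HTML (the URL is a placeholder):

    <!-- rel="nofollow" tells bots not to pass ranking credit to the link -->
    <a href="https://example.com/" rel="nofollow">commenter's site</a>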

If you want to comment on WordPress blogs, search for blogs that have CommentLuv installed. CommentLuv is a WordPress plugin that lets commenters add a link to their latest article along with their comment.

12. Guest Posting

Guest posting is one of the best things you can do to get backlinks, referral traffic, and recognition in your niche – and it also gets search bots to crawl and index your website when they visit the site where you are guest posting.

Guest posting is the method top marketers and bloggers use to get link juice from high-authority blogs by publishing content on them; this content is usually of very high quality.

13. Ping Services

Ping services are like doorbells for search engines: by pinging, you tell the search bots about new or updated content on your blog and request that they come and crawl it.

WordPress has a built-in feature to ping as many services as you want, and Ping-O-Matic and Pingler are great standalone pinging tools.
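In WordPress, the ping list lives under Settings > Writing > Update Services, one URL per line. Ping-O-Matic’s endpoint is the WordPress default:

    http://rpc.pingomatic.com/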

14. Add a Blog

Blog content gets crawled and indexed more quickly than static pages. If you have a business website, adding a blog will significantly increase your traffic and crawl rate – and maybe your sales. A blog also gives you a better shot at outranking your competitors, since you now have both content and a product to compete with in your niche.

A blog has become a must-have for business websites and companies, and blogs bring in more traffic: businesses that blog regularly generate 55% more visitors to their sites than those that don’t, according to HubSpot.

15. Use Robots.txt


Robots.txt is a file that gives strict instructions to search engine bots about which pages they can crawl and index – and which pages to stay away from. Spiders read this file on your domain before doing anything else on your blog; in the absence of a robots.txt, a spider will assume it may crawl and index all of your pages.

Now you might wonder “Why on earth would I want search engines not to index a page on my site?” That’s a good question!

In short, you don’t want search engines to index your admin pages or confidential pages, such as download areas. Every indexed page counts separately for search result purposes, so it pays to keep low-value pages out of the index.
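As a minimal sketch, a robots.txt that keeps bots away from the admin area and a private downloads folder looks like this (the /downloads/ path is just an example):

    User-agent: *
    # Keep admin pages and private downloads out of search results
    Disallow: /wp-admin/
    Disallow: /downloads/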

Note: It’s very important to use only a plain text editor, and not something like Word or WordPad, which may add hidden formatting codes that confuse the bots.

WordPress bloggers can optimize their robots.txt files using a reliable WordPress plugin like Yoast SEO.

16. SEO Tags

SEO tags and attributes can affect how Google and other search engines index you. Far too many people focus only on SEO titles and descriptions and never think about tags and attributes like dofollow, nofollow, index, and noindex.

SEO is a vast topic, and it is hard to keep up with all the changes going on in it. We have covered more in 15 Best Ways To Increase Google Crawl Rate Of Your Website.

Conclusion

There you have it – sixteen (16) methods for getting your new site or blog indexed quickly by Google and other search engines.

That’s all – we hope this article helped you index your website in Google Search within 24 hours. You may also be interested in the 5 Best SEO Plugins For WordPress.

If you liked this article, then please subscribe to our YouTube Channel for WordPress video tutorials. You can also find us on Twitter and Facebook.

Do you have experience with any of these? Would you add others to the list? Share your thoughts in the comment section below.

 

Best WordPress Database Backup Plugins

Do you want to know how to make a WordPress database backup using a plugin? There are free and premium plugins that will help you back up your WordPress database easily, within minutes. In this article, we will show you how to make a WordPress database backup using 7 plugins.

When and Why Make a WordPress Database Backup

You should always have a WordPress backup system for your site, as it gives you the option to restore your site if anything goes wrong. You should also use a third-party backup solution rather than relying only on the backup service provided by your hosting provider.

Many users lose their content and data when their site is hacked or becomes inoperable, and that is when you need to restore. You can download a fresh copy of WordPress from wordpress.org, but what about your content and data? Your content can only be restored if you have a backup.

You can create a backup with plugins, through cPanel, or manually (see our guide on how to make a WordPress database backup manually). Having said that, let’s take a look at the plugins that make WordPress database backups easy.
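For context, a manual database backup usually boils down to a single mysqldump command on your server (db_user and wp_database are placeholders; the real values are in your wp-config.php):

    # Export the WordPress database to a dated .sql file
    mysqldump -u db_user -p wp_database > wordpress-backup-$(date +%F).sql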

1. BackWPup


BackWPup is a free plugin that lets you create a complete backup, not just of your database, and store it in your favorite destination: Dropbox, Amazon S3, Rackspace, FTP, email, or even your own computer. I have used several WordPress backup plugins, and I have to say BackWPup has a very clean user interface and is really simple to use.

If you want to use Google Drive as your cloud backup service, BackWPup has a premium version with that feature. Restoring your database is really easy and simple, and hardly takes two minutes.

 

2. WP Database Backup


WP Database Backup is a free plugin that helps you easily create WordPress database backups. It can upload your backups to cloud services like Dropbox, Google Drive, Amazon S3, and FTP, or even send them by email.

WP Database Backup has some great features: automated, repeating schedules; the option to download backup files directly from the WordPress dashboard; sending backups to multiple cloud services; searching for a particular backup; and very simple configuration.

At the time of writing, it has around 60,000+ active installs and a rating of 4.5 out of 5 stars.

 

3. BackupBuddy


BackupBuddy’s great features include:

  • It automatically creates backups on a regular schedule.
  • Backs up to the cloud service of your choice.
  • Comes with one-click restoration and migration of backups.
  • Backs up all your media files and folders, and even the database.

BackupBuddy is a premium plugin that costs around $80 to $297, depending on the license you choose. We have a dedicated post on how to use BackupBuddy to protect your blog.

 

4. WP-DB-Backup


WP-DB-Backup is another backup plugin, much like WP Database Backup, and it lets you easily back up your WordPress database tables as well as other tables created by plugins or other software.

At the time of writing, WP-DB-Backup has a 4.5 out of 5 star rating and 400,000+ active installs. Its only downside is the lack of documentation and screenshots showing how the plugin works.

 

5. WP-DBManager


As the name suggests, WP-DBManager lets you manage your entire database from one place, and it’s not just for backups: it can also optimize and repair your database, back up, restore, and even delete databases. It supports other operations too, like dropping tables and running selected queries on your tables. Like every other backup plugin, WP-DBManager comes with automatic scheduling of backups, optimization, and repairs.

WP-DBManager’s interface is really simple and clean. It lets you set the path where your MySQL dumps and log files are stored, and its automatic scheduling lets you choose intervals such as weeks, months, or years. It also has a feature most bloggers and website owners will love: database optimization.

Everyday use of WordPress stores a lot of data that isn’t necessary. WP-DBManager helps you clean out the junk: it deletes old post revisions that are no longer needed, as well as transient values and metadata that have expired or fallen out of use.

At the time of writing, WP-DBManager has a 4.4 out of 5 star rating and 100,000+ active installs.

 

6. BackUpWordPress


BackUpWordPress not only lets you back up your database but can also create a complete backup of your entire site, and you can schedule those backups. It is super simple to use, requires no setup, has a very low memory footprint, and works even on shared hosting.

BackUpWordPress’s features include managing multiple schedules, emailing you when a backup is done, and support for both Linux and Windows servers. You can also include or exclude specific files and folders from a backup, and the plugin has been translated into several languages.

BackUpWordPress has a 4.7 out of 5 star rating and 200,000+ active installs – currently the highest rating of all the plugins mentioned in this article.

 

7. WP Time Capsule


Many new bloggers and website owners want a complete WordPress backup solution for free. WP Time Capsule is not just a simple backup plugin; it is a complete, automated, incremental backup solution for any WordPress blog, and it is absolutely free. Creating backups of your site is one of the best things you can do for yourself, and WP Time Capsule helps you achieve that.

WP Time Capsule supports three cloud services: Dropbox, Google Drive, and Amazon S3. To use WP Time Capsule you have to download the plugin and also sign up on its website. We at Smart Active Blogger use WP Time Capsule for our own complete backups, and we have a complete guide on how to set up your backup using WP Time Capsule.

Conclusion

Backing up your site is the best thing you can do for yourself and your site. It gives you peace of mind and helps ensure that your blog will never die because of a hacker or a technical problem. That’s all – we hope this article helped you learn how to create a complete backup of your site for free. You may also be interested in the 9 Best WordPress Plugins For Backup And Restore.


How To Submit Your Blog Sitemap To Google Search Console

If you have just started blogging, you are probably curious about how your blog can appear in Google search results and drive traffic.

Many newbies think they need to pay Google, or hire an agency, to get their website to show up in search results.

But it’s free.

Yes! Listing your blog on Google’s search results pages is free and requires only a few easy steps on your side.

If you want your website in Google as soon as you launch it, our guide How To Index a Website in Google Search Within 24 Hours will definitely fulfill your requirements.

Google Search Console, a.k.a. Google Webmaster Tools, is the tool we are going to use. It is a free tool from Google that will help you list your blog and show up in search results.

To get started with Google Search Console, you need to verify that you are the current owner of the domain. To verify, you add a meta tag to your site or use one of the other verification methods.
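The meta tag method means pasting a snippet like this into your site’s <head> (the content value below is a placeholder; Search Console generates the real token for you):

    <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN">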

After you have verified your website in Search Console, the next step is to submit your blog’s sitemap.

Before you begin, though, you should know what a sitemap is and why it is important to submit it to Google Search Console.

What is a sitemap?

A sitemap is a map of your website’s links – an XML file that contains all the links on your website. New websites are easily missed by Google’s crawlers as they go around the web crawling and indexing new links.

Because new websites have few links and little content, the bots are not aware of them. By adding a sitemap to Google Search Console, you let the bot know about your blog and all the content on it.

A sitemap has various properties that affect its performance and how bots crawl it: the size of the sitemap, the number of links, the priority values, and the content types.

In WordPress, sitemaps can be generated dynamically by a plugin, so they stay current automatically. On a static website you need to use a sitemap generator and repeat the process every time you publish new content; in WordPress it is a one-time setup.

Now that you know about sitemaps, it’s time to get your website listed in search results.

Submitting Your Blog Sitemaps to Google Search Console:

I will assume that you have created your sitemap using a plugin if you are on WordPress, or using an online service if you have a static website. If you don’t have one yet, you can create an XML sitemap for WordPress in 2 minutes.

If you created your sitemap with an offline app, you need to upload it to the root directory of your website.

Now log in to Google Search Console (Webmaster Tools) and select your site. From the menu on the left side, click Crawl > Sitemaps > Add/Test Sitemap.


After submitting your sitemap, refresh or reload the page. Google Search Console will show you whether the sitemap has been accepted.


Depending on the size and type of your sitemap, it may take some time to process. If your blog has lots of images and custom post types, it is advisable to use a separate sitemap for each, as that makes them easier for you to maintain and for Google’s bots to crawl and index.
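When you split things up like that, a small sitemap index file ties the separate sitemaps together (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/post-sitemap.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/image-sitemap.xml</loc>
      </sitemap>
    </sitemapindex>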

The tool also shows the status of each sitemap, such as the number of web pages submitted and indexed, along with a separate status for images.

Once you have added all your sitemaps to Google Search Console, you have completed all the steps required to get your blog to show up in Google search results.

That’s all. We hope this article helped you learn how to submit your blog sitemap to Google Search Console. You may also want to see our guide on 15 Best Ways To Increase Google Crawl Rate Of Your Website.


How To Optimize WordPress Robots.txt File for SEO

If you’ve ever wondered:

“What is a robots.txt file?”
“Do I need a robots.txt file?”
“How do I optimize my WordPress robots.txt file for SEO?”
“What is the importance of robots.txt?”

Recently one of our readers asked us whether they need a robots.txt file and what impact it has on SEO, so we have created a step-by-step guide that will show you EXACTLY how to optimize your robots.txt file for SEO.

So what is Robots.txt?

Robots.txt is the opposite of a sitemap: instead of telling Google which of your blog’s content you want the bots to index, the robots.txt file tells them which pages of your site you do not want crawled and shown in search results.

Robots.txt plays a major role in SEO and search engine ranking, but if you configure the file the wrong way, your presence in search engines can be completely wiped out.

SEO is a vast field with many elements, and robots.txt is one of them.


Do You Need a Robots.txt File?

The robots.txt file stops search bots from indexing particular areas of your website, but that is not a bad thing. Note that the absence of a robots.txt file will not stop search engines from crawling and indexing your website.

The robots.txt file protects your confidential files, admin folders, and any other files and folders you don’t want showing up in search results. It also tells the bots what to crawl, since robots.txt is the first file bots read on your site.

If you submit your XML sitemap to search engines, the bots already know where to look for your content and will crawl it, unless you have restricted them in Webmaster Tools.

We highly recommend creating a robots.txt file immediately if you don’t have one.

Where is the robots.txt file on my website?

The robots.txt file is usually found in your site’s root folder; if it’s not there, you need to create one.

In WordPress it resides in the root directory.

SmartActiveBlogger also runs on WordPress, and our robots.txt file is located in the root directory – https://www.smartactiveblogger.com/robots.txt

How to Create a Robots.txt File for My Website?

You can create robots.txt using cPanel, your host’s file manager, or an FTP client. We will use an FTP client here; the process is the same in cPanel, you just need to log in to your account.

First, connect the FTP client to your WordPress directory, or log in to cPanel and navigate to File Manager.

Using a plain text editor such as Notepad, create an ordinary text file and name it robots.txt, then simply upload it to your site’s root directory. If you already have a robots.txt file, you don’t need to create a new one.

How to Use Robots.txt file?

If this is your first time working with robots.txt, it may feel a little scary, but don’t worry – we have written this guide assuming you are completely new to robots.txt.

The format of robots.txt is pretty simple. The “User-agent” line names the bot you are instructing – for example, Googlebot or Bingbot. You can use an asterisk (*) to instruct all bots.

The next line carries an “Allow” or “Disallow” command, so the search bots know whether they should visit the mentioned directory or not.

With Allow, the search bots will crawl and index the listed paths; with Disallow, they will ignore the directory and its contents and won’t index them.

Here is what a sample robots.txt file looks like:
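    User-agent: *
    Allow: /wp-content/uploads/
    Disallow: /wp-admin/
    Disallow: /readme.html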


In this sample robots.txt, the * means the commands apply to all bots, and Allow: /wp-content/uploads/ means the bots may crawl and index everything inside the uploads folder.

In the next two lines, we have disallowed one directory and one file (here, the wp-admin directory and the readme.html file).

Optimizing Your Robots.txt File for SEO

Optimizing your robots.txt file for SEO is quite simple, but you should follow Google’s webmaster guidelines. Google advises against using the robots.txt file to hide low-quality content on your website, such as category, date, and other archive pages.

The WordPress plugins you use to add meta tags and descriptions to your content can manage this instead, by adding nofollow and noindex tags to archive pages.

Other pages, such as the login and registration pages, already carry nofollow and noindex tags, so you don’t need to add them to your robots.txt.

It is recommended that you disallow the following (a typical set of rules; the reasons are explained below):
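    # Hides which WordPress version you are running
    Disallow: /readme.html
    # Stops bots from listing your installed plugins
    Disallow: /wp-content/plugins/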


Why are these files disallowed?

These files can be used by a hacker, or by someone running malicious queries, to locate websites running a specific version of the software; disallowing them protects you, because the files won’t show up in search results.

Disallowing the plugins folder is also recommended, because someone could otherwise find sites running a particular vulnerable plugin and exploit it.

Adding Your XML Sitemap to Robots.txt File

If you use a plugin to create your XML sitemaps, those sitemaps are not automatically added to your robots.txt. You should add your sitemap URL to robots.txt yourself; doing so makes sure the search bots can find all the URLs on your blog that need indexing.

Adding your XML sitemap to robots.txt can help your site get indexed even if, for some reason, you forgot to add the sitemap in Webmaster Tools.
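It takes just one line, which can go anywhere in the file (replace the URL with your own sitemap’s location):

    Sitemap: https://www.example.com/sitemap.xml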

What Should an Ideal Robots.txt File Look Like?

For every website the robots.txt file differs.

Here is another, fuller example along the lines of the file we use here on SmartActiveBlogger (a representative sketch combining the rules above; the live file is at the URL mentioned earlier):
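    User-agent: *
    Allow: /wp-content/uploads/
    Disallow: /wp-admin/
    Disallow: /wp-content/plugins/
    Disallow: /readme.html

    # Sitemap location (the path here is illustrative)
    Sitemap: https://www.smartactiveblogger.com/sitemap.xml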


That’s it for your robots.txt file.

Is My Content Affected by the New Robots.txt File?

After making all those changes, you should at least check whether your content was affected. For that, Google Webmaster Tools has a fantastic tool called Fetch as Google, located at Crawl > Fetch as Google.

In the Fetch as Google tab, add a link to any of your posts and test it by clicking Fetch and Render. The tool will tell you the bot type and the fetch status.

You can also check the effect of robots.txt with the robots.txt Tester, located under Crawl > robots.txt Tester. Add any link from your blog that you want to test, click the Test button, and you will see whether the bot has access to it.

That’s all. We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO. You may also want to see our guide on 15 Best Ways To Increase Google Crawl Rate Of Your Website.
