SEO, short for Search Engine Optimization, is, in simple words, a set of techniques that makes a website capable of ranking near the top of search engines such as Google, Yahoo, and Bing. In other words, SEO is the set of procedures that helps web pages appear at the top of the results when people search for a topic.
I will explain with a simple example. Suppose a library holds hundreds of thousands of books and a person wants to borrow a geography book, so he asks the librarian. If the library has not organized its books in a particular order and by a certain process, it may take months to find that one book. Similarly, if your website has not implemented certain techniques to make the search job easy for Google or Yahoo, no one will visit it, because the search engines cannot list your pages. So, if you want to increase the traffic to your website, you must pay attention to SEO.
There is no hard and fast rule for improving the SEO of a website, but there are techniques and tools that can optimize a site so that it appears at the top of search engine results. Some of these techniques are described below:
Every web page has two fundamental parts: the head section and the body section. The head section contains very important tags that hold information about the web page itself; that is, these tags are not about the contents of the body but about the page as a whole. The <title> tag is the most important tag inside the <head> tag. It works like the table of contents of a book: when you are searching for a topic, if the chapter title is clear in the table of contents, it is much easier and quicker to find that chapter. Similarly, if every web page on a different topic has a different, descriptive title, search engines will find those pages easily and quickly.
The best practice is to write specific, descriptive words or phrases in the title of each page, and rather than creating one lengthy page, it is a good idea to create separate landing pages for different topics.
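For example, a head section with a specific, descriptive title might look like the sketch below; the page topic and wording are just placeholders:

<head>
<!-- a descriptive title that tells search engines what this page is about -->
<title>Beginner's Guide to Landscape Photography</title>
</head>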
Again, one of the most important tags in the head section of a web page is the <meta> tag. Meta tags come with different names, such as keywords, description, content-type, and so on. One of the main jobs of an SEO expert is to create unique keywords for each web page, related to its content. If you have ten pages on your website covering different topics, the best practice is to generate unique keywords, a unique description, and a unique title for every page.
When you search for a word or sentence in the search field of a search engine, it lists all the related results. The first, underlined heading of each result is the page title; underneath the title is the URL, and underneath that is the description provided in the meta tag.
The contents of meta tags do not appear anywhere on the website itself; however, the meta description does appear in the search results underneath the URL. Meta tags are important for search engines because they hold critical information about the web page. The more accurate and unique the description is, the better the chance of appearing at the top of the search results.
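A description meta tag sits in the head section of the page; a sketch is shown below, with placeholder text that you would replace with a summary of your own page:

<head>
<title>Beginner's Guide to Landscape Photography</title>
<!-- the description below is what search engines may show underneath the URL -->
<meta name="description" content="Practical landscape photography tips for beginners, from choosing a lens to composing a shot.">
</head>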
Keyword tools help you create unique keywords, which can then be used in the website's name and inside the content of the keywords meta tag. There are different tools to generate keywords; a few free ones are listed below:
To use Google's Keyword Planner, sign in with your Gmail account, then follow the links: click Tools and then Keyword planner.
Type your keyword, your landing page, and your product category (that is, whether your website is related to education, arts, health, and so on), and click the Get ideas button.
Once you have found your keywords, and if you have not registered your website yet, it is ideally a good idea to register a domain name based on a specific keyword or a combination of keywords.
And, finally, choose the best keywords and type them into the keywords meta tag. On every page, the keywords must be related to the contents of that page, as in the sketch below.
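For instance, a keywords meta tag for a page about healthy recipes might look like the following line; the keyword values are only illustrative and should come from your own keyword research:

<!-- keywords chosen to match the content of this particular page -->
<meta name="keywords" content="healthy recipes, quick dinner ideas, low calorie meals">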
Search Engine Optimization is not just a set of techniques that you follow at the beginning of web development; it is also about creating fresh content on a regular basis. Search engines love to crawl newly written articles, so you need to keep updating your site and uploading fresh, quality content.
Every web page will have anchors, or links: a word or group of words that you link to another web page or to another part of the same page. The ideal practice is to create meaningful and creative links.
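For example, descriptive anchor text like the line below tells both visitors and search engines what the linked page is about; the URL and wording here are placeholders:

<!-- meaningful anchor text instead of a vague phrase like "click here" -->
<a href="http://www.example.com/seo-basics.html">Read our beginner's guide to SEO basics</a>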
Creating a sitemap is one of the important jobs from a search engine's perspective. A sitemap is a collection of URLs. A sitemap for any website can be generated with Google's tools, and there are also third-party sitemap generators that produce sitemaps in various formats. Once the sitemap is ready, it can be submitted to Webmaster Tools.
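A minimal XML sitemap has the shape sketched below; the URLs and date are placeholders for your own pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want search engines to find -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>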
Once the download is complete, save the file in your folder and upload it to the root folder of your website via an FTP client such as FileZilla.
-Go to Google's Webmaster Tools page (www.google.com/webmasters/). Click Add a site if you have not added your site to Webmaster Tools before; if you have already added it, it will be listed, so click the particular website for which you have generated the sitemap.
-Click Crawl and then click Sitemaps.
-On the right-hand side, click the ADD/TEST SITEMAP button.
-Type sitemap.xml, click Submit Sitemap, and you will see a Sitemap submitted message.
Search engines crawl websites every now and then according to certain processes. Google is one of the most powerful search engines. Googlebot is Google's web crawling software, also called a spider; it finds new and updated pages and adds them to Google's index.
Google uses a huge set of computers to fetch (or "crawl") billions of web pages, and Googlebot works according to algorithms that specify which sites to crawl, how often, and how many pages to fetch from each site. Googlebot's crawl process begins with a list of web page URLs generated from previous crawl processes and augmented with the sitemap data provided by webmasters. As Googlebot visits each of these websites, it detects links (SRC and HREF) on each page and adds them to its list of pages to crawl.
Googlebot can be prevented from crawling content on certain sites; one method is to use a robots.txt file to block access to files and directories. The robots.txt file must be placed in the correct location, the root directory of the website.
Web robots (also known as web wanderers, crawlers, or spiders) are programs that traverse the web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.
The robots.txt file is used to provide instructions to web robots; this is called the Robots Exclusion Protocol. It works like this: before a robot visits a website URL, say http://www.mileytech.com, it first checks for http://www.mileytech.com/robots.txt and finds:
User-agent: *
Disallow: /
The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.There are two important considerations when using /robots.txt:robots can ignore your /robots.txt. Especially malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers will pay no attention. the /robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use. So don't try to use /robots.txt to hide information.
source: www.robotstxt.org