
Search Engine Optimization #1: On-Page Optimization - Generating Robots and Sitemap Files


Hello friends!! Welcome to our channel HIGH-TECHDROID, and this is Bharath Krishna. Today we are going to look at what Search Engine Optimisation (SEO) is, its categories, and its uses in this video. Let's get into the video.

Before getting into Search Engine Optimisation, let's have a look at how a website works. A website comprises both a front end and a back end. For example, I've opened an application called Facebook. The part the user interacts with in the browser is called the front end. Now, if I enter my username and password, the browser sends the request and renders the results; this is done with the help of a supporting layer called the back end. The front end is designed using markup and styling languages such as HTML and CSS, while the back end is developed in languages like PHP, C, C++, Java, and, more recently, Python.

Next, there is a third concept called Digital Marketing. Digital Marketing simply means that when we launch a new website, users (i.e., customers) do not have any idea about our website, so we need promotion, i.e., we advertise our website. With the help of promotion, our user level increases, and when the user level increases, our website's views and ranking increase. This is known as Digital Marketing.

Now there is a concept called search engine optimisation. Search engine optimisation is the process of getting traffic from search engines. So now let's take Google. In Google, if we drive traffic to our website, our viewer and user rate increase. This is called search engine optimisation.

In search engine optimisation, there are two categories: 1. On-page optimisation and 2. Off-page optimisation. On-page optimisation comprises all the activities that we do within our website to tune our website. Whatever activities we do for our website other than what we do inside the website is known as off-page optimisation. Under off-page optimisation falls another category called Social Media Optimisation. In social media optimisation, we give the details of our website and publicise it through image submission, video submission, and other such activities, and publish our website on social media. That is off-page optimisation in short. Let me explain off-page optimisation in detail in the next video.

Now let's get into on-page optimisation. There are a lot of approaches in on-page optimisation, and one of them is the robots file. In our website, there may be private or confidential files that need to be protected from web crawlers. Web crawlers include GoogleBot, Google's web crawler; similarly, Bing, Yahoo, and others have their respective web crawlers. These web crawlers first read the robots file of our website before crawling the site. In the robots.txt file, the pages which should not be crawled are listed, so the web crawlers skip those pages and crawl the rest of the website. That is the purpose of the robots file.

Now let me show you how the robots file looks. I'll take an application called WordPress as an example. On visiting WordPress.com/robots.txt, the robots.txt file of that particular website is displayed. In it, the user agent is given as *. The user agent refers to the search engines, and * represents "for all", meaning all the search engines have to follow its rules. Hence it is given as User-agent: *. Next, the Allow and Disallow rules let us select the pages the crawlers can and cannot crawl, respectively.
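For reference, here is a minimal sketch of what such a robots.txt file can look like; the /private/ and /admin/ paths are hypothetical placeholders, not WordPress's actual rules:

User-agent: *
Allow: /
Disallow: /private/
Disallow: /admin/

Crawlers that honour the file will skip anything under the Disallow paths. Keep in mind that robots.txt is only advisory, not access control, so truly confidential files still need server-side protection as well.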
In WordPress, pages like the activate page should not be crawled, so they have listed them in the robots file; as a result, the crawlers do not crawl the activate page. That is what the robots file is used for.

To generate a robots.txt file, go to a search engine and type "robots.txt file generator", then click on the first link. In it, there is an option called "Default - All Robots are Allowed"; do not change anything in it. Next, the search engines that will crawl the site are selected. Then paste the links of all your private and confidential directories into "Restricted Directories", so that these files are put under "Disallow" and the web crawlers do not crawl those specific pages. Next, click on "Create robots.txt file". Now our robots.txt file is generated, and we can paste it in our root folder.

Next is the sitemap. This is our second concept. A sitemap provides a way for Google, i.e., for search engines, to access our website and its pages. To see the contents of a sitemap file, I'll go to the same application, WordPress, and see what they have done with their sitemap. Our sitemap will always be in XML format (sitemap.xml); we have to create it in XML, so that all our page links are given in this one sitemap page.

What happens then is that Googlebot will find our sitemap page and crawl it, i.e., read it, read all the links in the page, understand them, and then store them in its database. If you ask what happens by storing them: when anyone searches, i.e., searches by our keyword, what Google does is show the link already stored in its database as the result. So our view rate will rise, our website will be shown to those users, and our website ranking will also increase.

The sitemap file can be generated by going to a search engine and typing "sitemap generator", then clicking on the first link (XML-SITEMAPS.COM). Now, by pasting our site link and just pressing start, our sitemap file will be generated automatically. This is only suitable for websites with fewer than 500 pages; if the site has more than 500 pages, we can pay and create an account, so that we can generate sitemaps for websites with more than 500 pages too.

Now I'm going to generate the sitemap for my blog. Just by pressing start, it is generated automatically. Once generated, when I click "view sitemap details", all my website links are listed there. We have to download it and paste the downloaded file into our root folder, i.e., our website's root directory. With that, we have successfully made the sitemap for our website. When a web crawler reads our website, all our website links will be stored in its database and shown to the users who are searching.

So today we have seen two strategies: what the robots.txt file is, and the sitemap file. Now, whenever we create a new project, i.e., a new site, we should not only create the sitemap file but also submit it in a webmaster tool, i.e., Google Webmasters tool. What happens then is that Google will automatically store our pages in its database, so viewers will reach our website. In case we haven't created a sitemap for our website, our website will still get reached, but slowly. If the sitemap is done, the website gets reached faster than it would without one. So this is the use of the sitemap.

And this is the end of the video. We will see you in the next video. Thank you!
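For quick reference alongside the video, here is a minimal sketch of the XML a sitemap generator produces; example.com and the page paths are hypothetical placeholders, and real generated files often include optional tags such as <lastmod> for each URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>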
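And if you prefer to build the file yourself rather than use an online generator, a small script can do it for a site whose page URLs you already know. This is a minimal sketch in Python, assuming a hand-maintained list of hypothetical URLs:

import xml.etree.ElementTree as ET

# Hypothetical list of the pages you want search engines to index.
PAGES = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contact",
]

# Root element with the standard sitemap namespace from sitemaps.org.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

# One <url><loc>...</loc></url> entry per page.
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write sitemap.xml; upload the result to the website's root directory.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)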

As found on YouTube
