Search Engine Optimization #1: On-Page Optimization - Generating Robots and Sitemap Files


Hello friends!! Welcome to our channel HIGH-TECHDROID, and this is BharathKrishna. Today we are going to look at what Search Engine Optimisation (SEO) is, its categories, and its uses. Let's get into the video.

Before going into Search Engine Optimisation, let's have a look at a website. A website comprises both a front end and a back end. For example, I've opened an application called Facebook. The part the user interacts with in the browser is called the front end. Now, if I enter my username and password, the browser processes the request and returns the results. This is done with the help of a supporting layer called the back end. The front end is designed using HTML, CSS, and similar mark-up languages, while the back end is developed using languages such as PHP, C, C++, Java, and recently even Python.

Next, there is a third concept called Digital Marketing. Digital Marketing simply means this: when we launch our new website, consumers (i.e., customers) do not yet have any idea about our website, so we need to promote, i.e., advertise it. With the help of promotion, our customer base grows. As the customer base grows, our website's views and ranking also increase. This is known as Digital Marketing.

Now there is a concept called Search Engine Optimisation. Search Engine Optimisation is the process of getting traffic from search engines. Let's take Google: if we bring traffic from Google to our website, our viewer and user counts increase. This is called Search Engine Optimisation.

Search Engine Optimisation has two categories: 1. On-page Optimisation and 2. Off-page Optimisation. On-page optimisation comprises all the activities we do within our website to improve its ranking. Whatever activities we do for our website outside the website itself are known as off-page optimisation. Under off-page optimisation falls another category called Social Media Optimisation.
In Social Media Optimisation, we share the details of our website and promote it through image submission, video submission, and other such activities, publishing our website on social media. That is off-page optimisation in short; I will cover off-page optimisation in detail in the next video.

Now let's get into on-page optimisation. There are a lot of strategies in on-page optimisation, and one of them is the robots file. Our website may contain private or confidential files that need to be protected from web crawlers. Web crawlers include GoogleBot, Google's web crawler; similarly, Bing, Yahoo, and others have their respective web crawlers. These web crawlers read the robots file on our website before they crawl, i.e., read, the website. In the robots.txt file, the pages that are not allowed to be crawled are specified. So the web crawlers skip those pages and crawl the rest of the website. That is the purpose of the robots file.

Now let me show you how the robots file looks. I'll take an application called WordPress as an example. On visiting WordPress.com/robots.txt, that particular site's robots.txt file is displayed. In it, the User-agent is given as *. The * symbol means 'for all', that is, all search engines have to follow its directives. Hence it is given as User-agent: *. Next, the Allow and Disallow directives let us select the pages that crawlers can and cannot crawl, respectively. In WordPress, pages like next and activate should not be crawled, so they are listed in the robots file, and as a result the crawlers do not crawl the activate page. Hence the robots file is used for this purpose.

To generate a robots.txt file, go to a search engine and type 'robots.txt file generator', then click on the first link. In it, there is an option called 'Default - All Robots'.
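To make the Allow/Disallow behaviour concrete, here is a small sketch using Python's standard-library robots.txt parser. The rules and URLs below are hypothetical placeholders, not WordPress's actual file.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, similar in shape to what a generator produces.
rules = """\
User-agent: *
Disallow: /activate/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Crawlers obeying these rules skip /activate/ but may fetch other pages.
print(parser.can_fetch("*", "https://example.com/activate/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post-1"))     # True
```

This is the same check a well-behaved crawler performs before fetching each page of your site.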
Do not change anything in it. Next, the search engines that will crawl the site are listed. Then paste the paths of all your private and confidential files into the Restricted Directories field, so that these files are put under 'Disallow' and the web crawlers do not crawl those particular pages. Next, click 'Create robots.txt file'. Now our robots.txt file is generated, and we can paste it into our root directory.

Next is the sitemap; this is our second concept. A sitemap gives search engines such as Google a path to access our website and its pages. To see the contents of a sitemap file, I go to the same application, WordPress, and look at what they have done with their sitemap: sitemap.xml. Our sitemap will always be in XML format; we create it in XML so that all our page links are listed in the sitemap page. What happens then is that GoogleBot will find our sitemap page and crawl it, i.e., read it, read all the links in the page, understand them, and then store them in its database. If you ask what happens by storing it: when anyone searches, i.e., queries with our keyword, Google shows the links already stored in its database as results, so our view count increases. Our website is surfaced to those users, so our website ranking will likewise increase.

The sitemap file can be generated by going to a search engine and typing 'sitemap generator', then clicking on the first link (XML-SITEMAPS.COM). By pasting our site link and just pressing start, our sitemap file is generated automatically. This is only suitable for websites with fewer than 500 pages; if a site has more than 500 pages, we can create an account, so that we can generate sitemaps for larger websites too. Now I am going to generate the sitemap for my blog. Just by pressing start, it is generated automatically.
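To show what a sitemap file actually contains, here is a minimal sketch that builds a sitemap.xml with Python's standard library. The page links are hypothetical placeholders; a real sitemap lists your site's actual URLs.

```python
import xml.etree.ElementTree as ET

# Hypothetical page links to include in the sitemap.
pages = [
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/contact/",
]

# <urlset> is the root element defined by the sitemap protocol.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page  # <loc> holds one page link

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting XML, saved as sitemap.xml in the site root, is exactly the kind of file the online generators produce: one `<url>`/`<loc>` entry per page for the crawler to read.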
Once it is generated, when I view the sitemap details, all my website links are listed there. We have to download it and paste the downloaded file into our root directory (i.e., our website's root folder). We have successfully generated the sitemap for our website. When a web crawler visits our website, all our website links will be stored in its database and shown to the users who are searching.

So today we have seen two strategies: the robots.txt file and the sitemap file. Now, when we create a new site, we should not only create the sitemap file but also submit it in a webmaster tool (i.e., Google Webmaster Tools). What happens then is that Google automatically stores our pages in its database, so visitors will reach our website. In case we haven't created a sitemap for our website, what happens? Our website will still get traffic, but slowly. With a sitemap, the website gets traffic faster than it would without one. So this is the use of the sitemap. And this brings us to the end of the video; we will see you in the next video. Thank you!
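A common convention ties the two strategies together: the robots.txt file in the site root can point crawlers at the sitemap with a Sitemap line, so crawlers discover it without a separate submission. The URL and Disallow path below are placeholders.

```text
User-agent: *
Disallow: /activate/

Sitemap: https://example.com/sitemap.xml
```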

As found on YouTube
