Search Engine Optimization #1: On-Page Optimization - Generating Robots and Sitemap Files


Hello friends! Welcome to our channel HIGH-TECHDROID, this is Bharath Krishna. Today, in this video, we are going to look at what Search Engine Optimisation (SEO) is, its categories, and its uses. Let's get into the video.

Before going into Search Engine Optimisation, let's have a look at what a website is. A website comprises both a front end and a back end. For example, I've opened an application called Facebook. The part the user interacts with in the browser is called the front end. Now if I submit my username and password, the browser processes the request and displays the results. This is done with the help of a supporting layer called the back end. The front end is designed using HTML, CSS and the like, which are mark-up languages. The back end is developed using C, C++, Java, and recently even Python is being used.

Next, there is a third concept called Digital Marketing. Digital Marketing simply means this: when we launch a new website, customers (i.e., users) do not have any idea about our website, so we need to promote it. With the help of promotion, our user level increases. When the user level increases, our website views and ranking also increase. This is known as Digital Marketing.

Now there is a concept called Search Engine Optimisation. Search Engine Optimisation is the process of getting traffic from search engines. So let's take Google: if we drive traffic from Google to our website, our viewers and user count increase. This is called Search Engine Optimisation. Search Engine Optimisation has two categories: 1. On-page Optimisation and 2. Off-page Optimisation. On-page optimisation comprises all the activities that we do within our website to rank our website. Whatever activities we do for our website outside the website itself are known as off-page optimisation.
Under off-page optimisation comes another category called Social Media Optimisation. In social media optimisation, we share the details of our website and promote it through image submission, video submission and other such activities, publishing our website on social media. That is off-page optimisation in short; I'll explain off-page optimisation in detail in the next video.

Now let's get into on-page optimisation. There are many techniques in on-page optimisation, and one of them is the robots file. In our website, there will be private files or confidential documents that need to be protected from web crawlers. Web crawlers include GoogleBot, Google's web crawler; similarly, Bing, Yahoo and so on have their respective web crawlers. These web crawlers crawl our website, i.e., read our website. In the robots.txt file, the pages which are not allowed to be crawled are specified. So the web crawlers do not crawl those pages but crawl the rest of the website. That is what the robots file is used for.

Now let me show you how the robots file looks. I'll take an application called WordPress as an example: on visiting WordPress.com/robots.txt, the robots.txt file of that particular site is displayed. In this, the user agent is given as *. The user agent refers to the search engines, and * means "for all", that is, all search engines have to follow its rules. Hence it is given as User-agent: *. Next, the Allow and Disallow rules let us select the pages the crawlers can crawl and cannot crawl, respectively. In WordPress, pages like next and activate should not be crawled, hence they have listed them in the robots file, so the crawlers do not crawl the activate page. To generate a robots.txt file, go to a search engine, type "robots.txt file generator", and click on the first link.
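To see how the User-agent, Allow and Disallow rules described above are interpreted by a crawler, here is a minimal sketch using Python's standard `urllib.robotparser`. The paths `/wp-admin/` and `/activate/` and the domain `example.com` are hypothetical placeholders, not WordPress's actual rules.

```python
# Sketch: how a well-behaved crawler interprets robots.txt rules,
# using Python's standard library. Paths and domain are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /activate/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Public pages may be crawled; disallowed paths may not.
print(parser.can_fetch("*", "https://example.com/blog/post-1"))  # True
print(parser.can_fetch("*", "https://example.com/activate/"))    # False
```

Rules are matched in order, so the Disallow lines shield the private paths while `Allow: /` leaves the rest of the site open to crawlers.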
In this generator, there is an option called Default - All Robots Are Allowed; do not change anything in it. Next, the search engines that will crawl are listed. Then paste the links of all your private and confidential folders under Restricted Directories, so that these files are put under "Disallow" and web crawlers do not crawl those particular pages. Next, click on "Create robots.txt file". Now our robots.txt file is generated, and we can paste it in our root directory.

Next is the sitemap, our second concept. A sitemap gives Google, i.e., search engines, a way to access our website and its pages. To see what a sitemap file contains, I'll go to the same application, WordPress, and look at its sitemap.xml. Our sitemap will always be in XML format; we have to create it in XML so that all our page links are given in this sitemap page. What happens is that GoogleBot will find our sitemap page and crawl it, i.e., read it, read all the links in the page, understand them, and then store them in its database. If you ask what happens by storing them: when anyone searches (i.e., searches by our keyword), Google shows the links already stored in its database as the result, so our view count increases. Our website gets surfaced to those users, so our website ranking also increases. This sitemap file can be generated by going to a search engine, typing "sitemap generator", and clicking on the first link (XML-SITEMAPS.COM). By pasting our site link and just clicking Start, our sitemap file will be generated automatically. This is only suitable for websites with fewer than 500 pages; if there are more than 500 pages, we can pay and create an account, so that we can generate a sitemap for a website with more than 500 pages too. Now I'm going to generate the sitemap for my blog.
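The XML structure described above (a `urlset` of `url` entries, each holding a `loc` link) can be sketched with Python's standard `xml.etree`; the page URLs below are hypothetical placeholders, and this is a hand-rolled sketch rather than what the online generator produces.

```python
# Sketch: building a minimal sitemap.xml with the standard library.
# URLs are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize with a default xmlns

urlset = ET.Element(f"{{{NS}}}urlset")
for page in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = page  # one <loc> per page

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

The result is the same shape a generator produces: every page link wrapped in a `<loc>` inside `<url>`, all under one `<urlset>`, which is what the crawler walks to discover the pages.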
Just by clicking Start, it generates automatically. Once generated, when I click View Sitemap Details, all my website links are given there. We then have to download it and paste the downloaded file in our root folder, i.e., in our website's root directory. We have successfully generated the sitemap for our website. When a web crawler reads our website, all our website links are stored in its database and suggested to the users who are searching.

So today we have seen two techniques: the robots.txt file and the sitemap file. When we create a new application, i.e., a new site, we should not only create the sitemap file but also submit it in a webmaster tool, i.e., Google Webmaster Tools. What happens then is that Google automatically stores our pages in its database, so viewers will reach our website. In case we haven't created a sitemap for our website, what happens? Our website will still get reach, but gradually. With a sitemap, the website gets reach somewhat faster than it would without one. So this is the use of a sitemap. And this brings us to the end of the video; we'll see you in the next video. Thank you!

As found on YouTube
