
English Google SEO office-hours from January 29, 2021


John Mueller: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller, I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hour hangouts, where people can join in and ask their questions around Search, and we can try to find some answers. A bunch of things were already submitted on YouTube, so we can go through some of those, but if any of you want to get started with the first question, you're welcome to jump on in.

Participant: Hi John, I hope it's okay if I start. I have a quick question regarding Core Web Vitals. Does it make a difference if a Core Web Vital is below the lower threshold, or between the lower and the upper threshold? What I mean is: if the Core Web Vital is yellow rather than green, does it make a difference for ranking, beginning in May?

John Mueller: I don't know if we've announced anything specific around that. My understanding is that we see whether it's in the green, and that counts as okay or not — so if it's in yellow, that wouldn't be in the green. But I don't know what the final approach there will be, because there are a number of factors that come together, and I think the general idea is that if we can recognize that a page matches all of these criteria, then we would like to use that appropriately in Search for ranking. I don't know what the approach would be where some things are okay and some things are not perfectly okay, or how that would balance out.

Participant: Okay. Will there be some kind of information about this before May?

John Mueller: I suspect so, yeah. The general guideline is that we would like to use these criteria to also be able to show a badge in the search results — I think there have been some experiments happening around that — and for that we really need to know that all of the factors are compliant. So if the site is not on HTTPS, then essentially, even if the rest is okay, that wouldn't be enough.

Participant: Okay, thanks.
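For context, the green thresholds being discussed here are the documented "good" limits for the three Core Web Vitals of that period: LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1. A minimal sketch of measuring them in the field with Google's open-source web-vitals library might look like the following; the CDN URL and console logging are illustrative only:

```html
<!-- Field-measurement sketch using the web-vitals library (v2-era API). -->
<script type="module">
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals@2?module';

  // Each callback fires with the final metric value for this page load.
  // "Good" thresholds: LCP <= 2500 ms, FID <= 100 ms, CLS <= 0.1.
  getCLS(console.log);
  getFID(console.log);
  getLCP(console.log);
</script>
```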
Participant: Good morning, John. I had a question in regards to knowledge panels. I'm working on a client site, and I wanted to see what steps I can take to get a knowledge panel generated. Is there certain schema, certain things I have to do, to make that happen?

John Mueller: We don't have any guidelines for how to enable a knowledge panel. Essentially, that's something our algorithms try to pick up automatically, and it's something where we take into account a number of different sources of information to try to figure out which entities are associated with a page, how relevant they are, and how we should be showing that in Search. So it's not that there's a specific meta tag or a specific type of structured data that you need to add; it's more that everything needs to align, so that we can really recognize that this page, or this site, is about a specific kind of entity.

Participant: Okay — no straightforward answer there.

John Mueller: I know there are some people outside of Google who have been working all around the knowledge panel side of things. Andy just linked to Jason Barnard; he's definitely totally on top of this, and he can probably give you a lot of tips on things you can work on to make everything align.

Participant: Okay, thank you.

Participant: Hi John, I have two questions, if that's not a problem. The first one regards duplicate content. Say we have a page that describes a bunch of cars, so certain parts of the description and content are the same. It is unique — we wrote it — and it's information that users will find helpful. But how will Google look at that? Will there be some sort of negative score or something?

John Mueller: With that kind of duplicate content, it's not so much that there's a negative score associated with it. It's more that if we find exactly the same information on multiple pages on the web, and someone searches specifically for that piece of information, we'll try to find the best-matching page. So if you have the same content on multiple of your pages, we won't show all of those pages; we'll try to pick one of them and show that. It's not that there's any kind of negative signal associated with it. In a lot of cases it's normal to have some amount of shared content across pages: in e-commerce, for example, you might sell a product that someone else is also selling, or within a website you might share a footer across all of your pages. Sometimes that's a pretty big footer, and technically that's duplicate content, but we can deal with that, so it shouldn't be a problem.

Participant: Great. The second question is about languages. Say we have English and Spanish versions of the same article, just translated, and we have great rankings in a Spanish-speaking area. Will that good ranking work for us, so that we get a better ranking in the US, or in an English-speaking area?

John Mueller: Not automatically. We treat these as different pages, and we will try to rank them individually. However, what's usually done when you have a localized copy of your content is that you link between the localized versions — from the English version to the Spanish version, and from the Spanish version to the English version — and based on those links, we can distribute some of the signals associated with that good page to the new language version. So it's not that your page in English will automatically rank just as well as your page in Spanish, but some of the effort you put in will carry over if you link between those versions. It's also the case that the competition in different languages is sometimes just very different: you might have a very strong page in Spanish while English is a much more competitive market, and then, even if we forward some of the signals to your English version, the competition in the search results may be very different.

Participant: Okay, thank you very much.
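One concrete way to do the cross-linking described here is hreflang annotations in the head of each language version, alongside the visible in-page links Mueller mentions. A minimal sketch, with placeholder URLs, mirrored on both pages:

```html
<!-- On https://example.com/en/article (and mirrored on the Spanish page) -->
<link rel="alternate" hreflang="en" href="https://example.com/en/article" />
<link rel="alternate" hreflang="es" href="https://example.com/es/articulo" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/article" />
```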
Participant: Hi John, I have two questions about video. When we add an image to a website, we usually use the alt attribute, the title attribute, and a caption, so that Google can understand what the image is about. What can we do for video? If I upload or embed a video in my content, what can I do?

John Mueller: I don't know the names offhand, but for video we do have a type of structured data that you can use, and it has fields for things like description and, I think, title as well, so that's something you can definitely use. But the more practical thing is the same as with images: if you have a caption right next to the video, or a heading on that part of the page, all of that applies to the embedded content as well. So it doesn't always require special markup or special attributes.
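The structured data type in question is schema.org's VideoObject. A minimal hedged sketch with placeholder values — Google's video structured-data documentation is the authority on which properties are required versus recommended:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to change a car tire",
  "description": "Step-by-step demonstration of changing a flat tire.",
  "thumbnailUrl": "https://example.com/thumbs/tire.jpg",
  "uploadDate": "2021-01-29",
  "contentUrl": "https://cdn.example.com/videos/tire.mp4"
}
</script>
```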
video landing pages because like slightly different content or attracts slightly different audience okay okay hi hi john uh can i ask something sure okay so for example there is a website that is getting a good amount of search traffic now um i just wanted to know if the same website is getting traffic from some other channels like direct traffic social traffic or some traffic from other blogs so that non-search related traffic that a website is getting is is some kind of positive signal for google like uh like people find that this website is getting a lot of traffic from uh direct or social channels so we should push uh this website in search also so that search users can find it easily we we don’t use that for seo so if like i i think it’s great to as from from a website point of view to diversify the different sources of traffic so that you have different places where people can find your site different places how people can get to your site but we don’t use that for search um so if if you’re active i don’t know on facebook or on other social media channels you get a lot of traffic there then that’s that’s kind of a way to balance out some of the uncertainty maybe around search or around discover or around some of the other channels that you also use sorry so kind of i think that diversification makes sense it’s something to to work on and it helps to make it a little bit safer with regards to your presence online but it’s not something that we take into account for search not directly or indirect like for example a brand it suddenly starts getting a lot of direct traffic so there’s no impact on search performance no no i mean indirectly you might see something if people go to your site and they think it’s fantastic and they share it with other people and we pick that up as a link then maybe we could take that into account but not directly sure sure thank you okay i see a bunch of people raise their hands and i just found a place where i can see the names so um i’ll just go through from from the top uh of my list here benjamin hi john how are you i really long time listener first time caller it’s an honor to be here i remember the matt cutts days but um thank you for letting me in um i’m writing it or i’m calling it because i purchased the website that has been around since 1998 it was owned by the same person until 2018 and can i paste it in the the am i allowed to paste it in the chat sure okay um it got purchased and then from what i can tell from all my seo research it got possibly turned into a pbn potentially or just a really bad website i purchased it and i’ve done everything i can to turn it around i got in touch with the original owner stocked the internet found him been emailing him back and forth asked him why he sold it he just said he got too old for it um updating all the content added new ui added social profiles which didn’t exist i’m seeing slight movements in traffic which leads me to my question i’m investing a lot of time into this you know money too but it’s just really my time and i’m putting my heart and soul into it because i do think that this site needs to exist i haven’t found much on this topic am i wasting my time like does google say oh this was maybe potentially a pbn or they did spam me back oh i did disavow as well um am i wasting my time like can you talk to the point of like are sites able to be turned around especially under new ownership new who is information new domain new host as well yeah um i i think the short answer is yes you can turn it 
Participant: Hi John, can I ask something? Say there's a website that's getting a good amount of search traffic, and it's also getting traffic from other channels — direct traffic, social traffic, traffic from other blogs. Is that non-search traffic some kind of positive signal for Google? As in: people find this website a lot through direct or social channels, so we should push it in Search as well, so that search users can find it easily?

John Mueller: We don't use that for SEO. From a website point of view, I think it's great to diversify your sources of traffic, so that there are different places where people can find your site and different ways for people to get to your site, but we don't use that for Search. So if you're active on, I don't know, Facebook or other social media channels, and you get a lot of traffic there, that's a way to balance out some of the uncertainty around Search, or Discover, or the other channels you use. That diversification makes sense, it's something to work on, and it helps make your presence online a little bit safer, but it's not something we take into account for Search.

Participant: Not directly or indirectly? For example, a brand suddenly starts getting a lot of direct traffic — there's no impact on search performance?

John Mueller: No. I mean, indirectly you might see something: if people go to your site, think it's fantastic, share it with other people, and we pick that up as a link, then maybe we could take that into account. But not directly.

Participant: Sure, thank you.

John Mueller: Okay, I see a bunch of people have raised their hands, and I just found a place where I can see the names, so I'll go through from the top of my list. Benjamin?

Benjamin: Hi John, how are you? Really long-time listener, first-time caller — it's an honor to be here; I remember the Matt Cutts days. I'm calling because I purchased a website that has been around since 1998. It was owned by the same person until 2018 — am I allowed to paste it in the chat? — and from what I can tell from all my SEO research, it then possibly got turned into a PBN, or just a really bad website. I purchased it, and I've done everything I can to turn it around. I got in touch with the original owner — stalked the internet, found him, emailed him back and forth, asked him why he sold it; he just said he got too old for it. I've been updating all the content, added a new UI, added social profiles, which didn't exist, and I'm seeing slight movements in traffic, which leads me to my question. I'm investing a lot of time into this — money too, but really my time — and I'm putting my heart and soul into it, because I think this site needs to exist. I haven't found much on this topic. Am I wasting my time? Does Google say, oh, this was maybe a PBN, or it spammed us before? I did disavow as well. Can sites be turned around, especially under new ownership, new WHOIS information, new host?

John Mueller: I think the short answer is yes, you can turn it around. In general, the most common issue we find is that there's a manual action involved, and that's something you would see in Search Console — I assume you checked.

Benjamin: Yeah.

John Mueller: And if there's no manual action, then essentially it's a normal website. It's not that we would take into account any previous manual actions; if some owner in between had a manual action, that's not something we would hold against the site. Essentially, the way the site is now is the way our algorithms are looking at it at the moment, and if you work to improve the site, and work on whatever other issues are associated with it, then that should improve over time.

Benjamin: There's no black stamp on it forever, type of thing?

John Mueller: No.

Benjamin: Cool, all right, thanks.

John Mueller: Thanks. Sean? Andrew? Let me just run through the names as I have them in the list, and after that I'll try to get through some of the submitted questions as well — just to make sure we cover everyone who's in the list.

Andrew: Hello, thank you. My question is about the situation where a site has an app connected to it, and for mobile users the browser offers to download the application, showing banners from Google Play or the App Store. As you may know, this offer appears at the top of the site, and it inevitably creates layout shifting, which is also hard to fix. What is your opinion on this? Is Google okay with these situations?

John Mueller: I assume you're specifically asking about Cumulative Layout Shift, from the Core Web Vitals?

Andrew: Sure, yeah.

John Mueller: For that, I would focus on the metrics you can pick up from the testing tools and from the real-user data for the site. We don't explicitly say that this specific kind of pop-up or banner is okay and that one is not; you can just test it and see what number comes out. Maybe there are things you can do to improve that number, and you can also look at the number and say, this is okay for me, or this is something I need to improve.

Andrew: But they do create this shifting, and you can't control them, because it's not on your side — the browsers just offer those banners. I could, I don't know, just close them off entirely, I suppose.

John Mueller: I don't know exactly how that looks in the browser, so it's hard to say. But if you see that this is a problem for some of your pages, maybe there are ways to make it less visible on the important pages and limit it to other pages, or do something such as letting the user click a button that then triggers it in the browser — something along those lines.

Andrew: Okay, thank you.
Lee: Hey John, just two quick questions here, piggybacking on the earlier question about page speed and Core Web Vitals. I work with a company that has a very authoritative page — more authoritative than the competitors that are outranking us for a number of keywords, which has to do with some other factors I'm dealing with right now. But we do have a significantly slower page, although I would say, objectively, we offer a more valuable experience and a better product, and we get more traffic than the pages we're ranking against. So my question is: is page speed used now as a significant ranking factor that would outweigh the content and user experience we're providing? And will it become more of a factor come May, when Core Web Vitals rolls out?

John Mueller: We use speed on mobile at the moment, so that plays in a little bit there, and in May, I think the idea is to revamp how we look at that. My guess is we will use it a little bit more strongly as a signal. However, it's still the case that when we can recognize that someone is looking for something specific, or that a page is particularly relevant to users, that's what we plan to show. For example, if someone is looking for your company, we wouldn't show some other page just because it's a little bit faster. That means if you have a really strong page in your topic area, it will probably be okay, even if it's a little bit slower. But it's really hard to say where the line will be drawn, or how much is not strong enough, or anything like that.

Lee: Cool, that makes sense. We're working on a number of things, so I'm not super worried about it, but our engineering team is backed up, and we have a very large website, so it's one of the things we're working on. My second question is about subdomains. My company's blog operates on a subdomain rather than a subfolder, and I'm wondering, as far as ranking signals go: is that viewed as another domain entirely, or does authority pass back and forth between the two? Obviously we're linking between the two effectively.

John Mueller: It depends. In some cases we'll see this as part of the main site; in some cases we'll see it as something separate. For the most part, I wouldn't recommend changing it blindly and hoping you'll get a big advantage out of it. Rather, if you're thinking about consolidating things on one domain, and it makes more sense from an infrastructure or tracking point of view, then all of those things point more in the direction of using subdirectories. But sometimes the same reasoning applies to subdomains, where you say, infrastructure-wise it's easier like this — maybe you're using some CMS that needs to be on a separate subdomain — all of those things.

Lee: All right, thanks so much, John.

John Mueller: Sure. Antonio?

Antonio: Hi — it's an honor to ask you something, thank you so much. I have a page that ranks for dozens of terms in the top positions, but for one search, or a group of searches, I'm stuck on the second page of the results. Is the best thing to include those terms and talk about them on the same page, or to create a specific page for those terms and link to it? I've had good and bad experiences with both options, and that leaves me in real doubt. What's your recommendation? If you want, I can give an example to explain my question better.

John Mueller: I totally understand the question, and I don't think I can give you an absolute answer, because, like you said, sometimes it makes sense to separate things out onto multiple pages, and sometimes it makes sense to concentrate on a single page.
I think it depends a lot on user expectations, and maybe also on the competition: would users be confused if they went to one general page when they expected a specific page, or would users also be okay with a general page? For the most part, I think fewer pages make sense, because you can make stronger pages — a little bit stronger with regards to the competition — but at the same time, a page that's specific about a certain topic is a very well-targeted page. Finding that balance is really hard. My usual recommendation is to try it out, test it, and see what works well for you in your specific situation.

Antonio: Thank you so much.

John Mueller: It's not a perfect answer, but hopefully it gives you some ideas at least.

Antonio: There's no absolute answer — I know, I've had this problem for many years. Thank you so much, you are incredible.

John Mueller: Thank you. Stefan?

Stefan: Let me set the stage real quick. I work for a fairly large company, and I have subdomain duplicate-content challenges similar to Lee's. It's not exact duplicate content, because the context is slightly different: there's a purchase aspect on one subdomain and a streaming aspect on a different one. We've been trying to tackle some indexation issues where one subdomain has a lower traffic potential and typically ranks less well. As I'm looking through Search Console at one of my sitemaps, under the Excluded section, one of the details is "Duplicate, submitted URL not selected as canonical", and there's a huge number of URLs with that flag. As I looked into some of the examples and used the URL Inspection tool, I noticed that, almost invariably, there was no URL included in the Google-selected canonical field. I posted about it in the help community and got a kind of boilerplate answer, and as we started this call, it looks like that issue has been fixed — the URLs now seem to be populated — but most of the examples I've found are completely off base: they have nothing to do with the actual query the page should rank for. So my first question is: how should one interpret a Google-selected canonical of "N/A"?

John Mueller: I don't know. I think I saw something similar on Twitter recently, and I think that was a bug on our side. In particular, if we index a page, then we always have a canonical, and if the tool says there's no canonical available but the page is indexed, that would be something conflicting on our side — not necessarily something actionable on your side. On the other hand, if it's not indexed, it would be normal for us to say we don't have a canonical associated with this URL, because it's not indexed, essentially.

Stefan: The report I'm looking at should be indexed pages only — or at least I've confirmed that they're indexed. Since we configure Search Console with separate properties per subdomain, my hypothesis was that the alternative subdomain was actually ranking, and you simply wouldn't include that URL, because it's not in the same subdomain's configuration.
John Mueller: No — I don't know what the specific bug there is, but it should be the case that even if it's on a different domain, if we say a different URL on another domain is canonical for this one, we should show that. It shouldn't be the case that we say, oh, we don't know — when actually we do know and just don't want to tell you about it. So that seems more like a bug on our side, rather than a sign that we use a different subdomain for that.

Stefan: Well, it looks as though the bug may have been fixed, but you may have introduced a new bug.

John Mueller: Oh no.

Stefan: The selected canonicals seem completely off base from the topic that should be canonicalized.

John Mueller: Okay. If you can drop some examples in the chat here, I can pick that up afterwards and pass it on to the team.

Stefan: Can do — thank you much.

John Mueller: Sure, thanks. Angie, I think you also have your hand raised.

Angie: Hi John — as always, thank you for doing this. I'm doing a site move for a client. We're changing the domain name — they want to present the brand as more modern, so we're just changing it to the acronym of the full name — and we're not removing any pages or changing any content, so it's going to just be doing the redirects, then using the Change of Address tool in Search Console, etc. Pretty straightforward. But I was wondering if everything should be redirected, like the sitemap and robots.txt. Specifically, is it recommended to leave both the old sitemap and the new sitemap up, so that Google is able to find all the redirects to the new site?

John Mueller: Yes.

Angie: And then for robots.txt: I saw in the crawl stats support document that 200s and 400s count as successful robots.txt responses, and 429s and 500s as unsuccessful, but it didn't mention anything about redirects. So is it recommended to leave it not redirected, so that Google is still able to crawl the old site?

John Mueller: I don't think it matters with regards to the redirect. I think in our site move documentation, at some point, we had a recommendation to have an empty robots.txt file on the old domain, just so that we can crawl all URLs and find the redirects to the new ones — or recognize, oh, this is now blocked, or it's a 404, or whatever. My guess is that's more of a super-small optimization with regards to crawling, and not really necessary. In particular, sitemap files and robots.txt files are more like control files, and aren't indexed anyway, so whether or not they redirect doesn't really matter. For all of the indexable content, though, redirecting is important for us, in the sense that if we recognize that a site is only partially moving, we have to shift into a mode of figuring out what actually changed, whereas if we can recognize that all of the indexable content has moved, we can shift into the mode of just transferring all of the signals in bulk to the new domain.
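For the mechanics of a move like this, the standard signal is a server-level 301 from every old URL to its new counterpart. A sketch in nginx terms — assuming nginx; the domain names are placeholders, and equivalent rules exist for other servers — which also keeps an empty, crawlable robots.txt on the old domain, per the old documentation tip Mueller mentions:

```nginx
server {
    server_name old-example.com www.old-example.com;

    # Serve an empty robots.txt (no restrictions), so Googlebot can reach
    # every old URL and discover its redirect.
    location = /robots.txt {
        return 200 "";
    }

    # 1:1 permanent redirects; $request_uri preserves path and query string.
    location / {
        return 301 https://www.new-example.com$request_uri;
    }
}
```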
Angie: Okay. We are doing a one-time switch, not moving in sections — literally flipping the switch — and I guess the signals will start being processed right away?

John Mueller: Ideally, yeah.

Angie: And on the note of site signals: I know these days we talk about pages being ranked rather than domains as a whole, but how much of a factor are the site-wide signals? What I'm trying to ask is: say we do everything perfectly, and we're not changing anything on the site, so everything should be pretty much the same once Google has finished reprocessing everything — can we expect essentially the same performance as before? It's going to be on the same server as well, so the only thing changing is the domain name. And what about new pages that go up on the site after the change? Because it's a new domain with no previous history, could new pages be at a bit of a disadvantage because Google has fewer site signals?

John Mueller: I think in an ideal situation, all of that will just transfer and continue working just like before. We've been doing a number of analyses internally on site moves, just to make sure we're not missing anything and that nothing goes wrong with most site moves, and for the most part, we can see that everything cleanly transfers to the new domain. So if there's nothing weird associated with the new domain, and the website really does a clean move, that should be completely fine. It's always something where you don't completely know what to expect, because it's a different domain and you can't really try it out ahead of time, so there's always a little bit of uncertainty involved. My recommendation is to make sure you're really tracking all of the details — have a big spreadsheet with all of the checklist items to double-check — so that, should anything weird happen, you can be certain you have all of the basics covered, and it's really something obscure you couldn't have known ahead of time; or you can go through the list and see, oh, I forgot to set the hreflang, or I forgot this specific thing, and then you can explicitly go in and fix that.

Angie: Okay, perfect. And then my last question: the site move documentation mentions that we should expect rankings to fluctuate, and I was thinking about how Google would reprocess those pages and how that would actually affect ranking. For example, if I have a page on the old site currently ranking at position one for a certain keyword, when we do the redirect and Google discovers it, is there a period — before the crawling and re-indexing happen — where it would actually drop out of the search results before being replaced by the new page? Or does the old page just stay there until the new page is reprocessed?

John Mueller: Essentially, what happens there is that we switch the canonical. We would have the old page indexed, we would start seeing the redirect, we would follow the redirect, see that the same content is there, and then our systems would say, oh, it looks like it moved to this new URL, and we would just switch that over. It's not the case that it would fall out first and then be re-indexed again; it's really, we see this connection, we see both of these pages, and we can just switch over to the new one. So there shouldn't be a hole with regards to traffic.
But usually, with all site moves, you have this period where some things are associated with the old site and some things with the new site, and it takes a little bit of time to shift the majority of things over. In a lot of the site moves I looked at while double-checking this, that's a period of a couple of days, maybe a week or so, where it fluctuates a little bit until it settles down again in a similar state on the new domain.

Angie: Okay, thank you very much.

John Mueller: Sure. We have tons of submitted questions as well, but the raised-hands format seems to be working fairly well, so I'm going to focus on questions from here, and maybe add some comments to the submitted questions so those don't completely get lost. All right — Kadu, if I got your name right?

Kadu: Yeah, that's right. Thank you, John, for this opportunity. My question is about Core Web Vitals. We have an education platform, and many of our requests come from logged-in users. Basically, we treat logged-in users differently from our other users: on the same page, we load a bit more stuff for them, with a few more steps, so it takes longer, and that makes the page perform much worse on Core Web Vitals. How concerned do I have to be about my logged-in users on this page we serve for them?

John Mueller: If it's the same URL that is publicly accessible as well, then it's very likely that we can include that in the aggregate data for Core Web Vitals — on the real-user-metrics side of things — and then we might be counting that in as well. So if the logged-in page exists at a public URL, we might see that some users are getting this slower, or more complicated, version of the page, and we would count those metrics. Whereas if you had separate URLs, and we weren't able to index those separate URLs, then that seems like something we would be able to separate out. I don't know what the exact guidance here is from the Core Web Vitals side, though, so I would double-check, specifically with regards to Chrome, in the CrUX help pages, to see how that would play a role in your specific case.

Kadu: Thank you, John.

John Mueller: Darcy?

Darcy: John, how's it going?

John Mueller: Pretty good.

Darcy: Okay, so I have a question along the lines of some of the subdomain questions people have been asking, because you said — if I heard your answer correctly — that there are sometimes cases where that can make sense. So I'm wondering if this situation makes sense, and what the potential risks are. The site is going to exist in two parts: one is the content-focused part of the site, and the other is the e-commerce-focused part, so you would have shop.mysite.com, with your e-commerce site on the subdomain. A couple of questions: would you recommend that strategy, what are the risks involved — and how much GameStop stock do you own?

John Mueller: Oh man, I don't know about the last one. Such a weird situation; it throws me off completely.
To back up a bit: we regularly talk with the search leads about subdomains versus subdirectories, and tell them, oh, SEOs are so obsessed with subdirectories — and they always tell us that people should just use whatever makes sense, because our systems should be able to deal with subdomains and subdirectories in essentially the same way. I think there are some second-order effects that some people are seeing there, but I don't think it's something where you automatically get a bonus for going to subdirectories, or for going to subdomains. So if you have your shop on a separate subdomain, and that's what works well for you with regards to tracking and all of that, I would try to keep it; equally, I don't see a problem with moving it over, or trying to find a way to do a reverse proxy and move it to a subdirectory.

Darcy: So you think they can — because I guess the concern is always the two-separate-sites situation, and trying to rank, since you want both to rank, et cetera. The problem we have here is a technology problem: the customer wants to use Shopify for the e-commerce side but not for the content side, and I think Shopify can't be installed in a subfolder. So you feel it could still work just fine? Would Google decide to link those and consider them one site, or are they always going to be considered two sites, no matter what?

John Mueller: It depends a little bit. In a situation where you just have two subdomains, we would probably treat those as separate sites. If you have a lot of different subdomains, we might see, oh, they're using wildcard subdomains as categories, for example, and that would be a clear sign that this is actually one site and we should treat it as one thing. In a case where you have content on one subdomain and the shop on the other, I really don't see a problem with them competing against each other, because usually people come with one intent: if they want the product, they'll find the product pages; if they want information, they'll find the informational pages. It's not that these are competing against each other, or otherwise getting in each other's way in Search.

Darcy: Okay, cool, thank you.

John Mueller: What was the other question?

Darcy: The GameStop one.

John Mueller: Oh, okay. All right, thanks.
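For completeness, the reverse-proxy arrangement Mueller alludes to could be sketched in nginx terms as below. The hostnames are placeholders, and — as Darcy notes — hosted platforms such as Shopify generally don't support being fronted this way, so treat this as illustrative only:

```nginx
# On the main domain's server block: present the separately hosted shop
# under /shop/ instead of shop.example.com.
location /shop/ {
    # The trailing slash on the upstream URL replaces the /shop/ prefix,
    # so /shop/cart is fetched from /cart on the backend.
    proxy_pass https://shop-backend.example.net/;
    proxy_set_header Host shop-backend.example.net;
}
```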
John Mueller: Eric?

Eric: Thanks — can you hear me? Hey, thanks for having me, and thanks for doing these sessions; it's really good. I have a few questions regarding Search. First of all, we had a review of the iPhone come out, and for some reason it ranked lower than the first-impressions article we did, which covered reviews from other magazines all over the world. We're really, really strong locally in our country, and we had a really good review, yet it sat below the article covering the first impressions. After one or two months, we noticed that we were missing the review schema — the rating, the price and so on — so I added that to the article, and after a couple of hours it ranked above the first-impressions article. Is that possible?

John Mueller: I don't think that would be directly related. In particular, the rich results, or the structured data, is something we use for the rich results in the search results; it's not something we use as a ranking factor itself. What might have happened is that you changed the page enough that our systems said, oh, we need to reconsider how we index this page, and based on that we reconsidered the ranking. But that shouldn't be related to adding structured data to a page. So it's not the case that every review needs to have review structured data on it.

Eric: That's exactly what I thought, okay. Also on this: do the missing fields in the schema.org review type matter — fields like brand, SKU, description, aggregate rating, and ISBN? Because I see that Google only requires three: the price, the name of the product, and the rating.

John Mueller: If you have the required properties covered, then we can show the rich result, and the rest are really more optional. At least as far as I know, it's not the case that we would rank things higher if you have more of the fields filled out.
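For reference, a minimal product-review snippet of the kind discussed here might look like the following. The property names are from schema.org, the values are placeholders, and Google's review-snippet documentation is the authority on which fields are required versus merely recommended:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "AcmePhone 12",
  "review": {
    "@type": "Review",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4.5", "bestRating": "5"}
  },
  "offers": {
    "@type": "Offer",
    "price": "999.00",
    "priceCurrency": "USD"
  }
}
</script>
```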
Eric: Okay. And I know that for search results Google only uses, you know, what it finds in the content on the site, but do the meta description and keywords tags matter at all for results?

John Mueller: We don't use the keywords meta tag at all, so you can do whatever you want with it; I don't think any search engine uses it anymore. The description we use as a guide for the description, or snippet, that we show in Search. It's not always taken purely from the description meta tag, but in a lot of cases it is something we use there. So if there's a specific snippet you want shown, definitely make sure it's in the description. But we wouldn't use it as a ranking factor; it's more that you're showing users what the page is about, and maybe they'll click through more if they understand it a little bit better — the page wouldn't rank higher because of it.

Eric: I see. And we received an email, actually yesterday and today, for our site in Search Console, saying you launched a new property type that groups all the subdomains and subdirectories, and HTTP and HTTPS, into one domain. So we verified the domain and everything, and we still don't see the data that we see for our HTTPS www property. Will this show up eventually?

John Mueller: Yes. The data sometimes takes up to a week to be completely visible in Search Console — some features are a little faster, some take a little more time — but it's normal for it to take a while. I think we sent that message in particular when we noticed that sites are not completely verified in Search Console: maybe you have traffic to the HTTPS version but only HTTP verified. We just wanted to make sure people are aware they're not looking at the full picture; with the domain verification, that's all covered automatically.

Eric: I see, perfect. We also noticed that our sitemaps were not loading properly in Search Console. It was fine for a few years — no problems — but last week we noticed that, for some reason, Search Console couldn't load our sitemaps from the sitemap index. We have an index file with all the sitemap parts, and it loaded most of them, but the ones that matter the most, the posts, it just couldn't load. Is that something we can do something about?

John Mueller: I would probably post about that in the help forum with the specific sitemap URL, so that someone can take a look, and if it is something more on Google's side, the folks in the help forum can usually help escalate it to us.

Eric: I think it's a bug, because we added some parameters onto the URL and it loaded fine, so it shouldn't be the file itself. Okay, and then the last question, sorry: we had a review of the AirPods Max — we were the first in the country to have both the product and the review — and I just don't know why we still don't show up in the results. It's really annoying, because we put a lot of time and effort into the product review, and we don't show up. Can I sort that out?

John Mueller: I don't know — I can't automatically make you rank higher. The one thing I would watch out for is whether the page is indexed or not. If it's not indexed, that's usually more of a technical thing you can work to improve; but if it is indexed and just not ranking the way you want, it's really hard for us to say whether this is a big problem or not. So what I would also do there is post in the help forum with the specifics: the URL, the queries you're looking at, and — especially if you're looking at a specific country — those details as well. They can take a look at it there and escalate it to us if appropriate. But a lot of these ranking questions are really tricky, because it's often a matter of which page we should show — everyone wants to be first, of course.

Eric: Okay, thank you very much, and have a good day, all of you.

John Mueller: Thanks. All right — Mark?

Mark: Yeah, hi, thanks John. This is more of a European-focused question. We have a cookie consent layer — obviously it's IAB-conformant, sending out the correct strings and everything. If I'm not mistaken, Google mentioned that loading layers like these would be recognized and not counted against the page. However, when I check the domain in the PageSpeed Insights tool, the layer is recognized and flagged as something to improve — but we use probably the fastest CMP on the market, so there's really nothing to improve there. Will Google go back and take that out of consideration, also with reference to May? What do you think?

John Mueller: Essentially, we'd be able to take that out with regards to indexing the content, but from a speed point of view, we would not differentiate between this kind of banner and any other kind of banner. From a speed point of view, it's something users see when they go to the page, and it would be taken into account there. So I don't think — at least as far as I know — that we have any logic to say, oh, this is a proper cookie banner, therefore we'll ignore that it's slow or that it causes layout shifts.
Mark: Well, there's that list of officially registered CMPs, right? So we're doing something that's legally required, and we're just wondering — because, in theory, a competitor who isn't using any of those layers might have a page speed advantage, and with page experience becoming more and more important — more out of curiosity, how do you deal with that?

John Mueller: My understanding is that we would see it as part of the page: if it's slowing down the loading of your page, we would take that into account and say this page is slower than the same page without that specific setup. But another aspect that plays into this is that you're competing with sites that have similar setups. It's less a matter of, we will not show your site at all in Search; essentially, other people have the same struggles and need to figure out the same things, so you're competing with other sites that have the same problem, which makes it a little more even. Obviously, in the situation you mention, where someone is — let's call it — rogue, and doesn't show any banner at all, we might say, well, maybe this is a faster page, but maybe the other page, even if it's a little slower, is a better, or better-fitting, page for the user, and then we would still rank that there. We do take speed and usability into account for rankings, but it's not the only thing; the content is still, by far, I think, the biggest aspect there.

Mark: Okay, thank you very much.

John Mueller: Sure. Raymond, let's see.

Raymond: Hi John. I also posted this question in the comments for the video. We have a site with a mega menu that has over 1,000 links. Back before 2018, this mega menu would only load upon user action: when the user hovered over the nav bar, an AJAX call loaded those links. At some point in 2018, we added those links statically, and — I know, correlation isn't causation — around that same time we saw a big drop in our search traffic. We are now contemplating removing these links from our nav bar as static links, and going back to links that only load upon user action, with an AJAX call. We are nevertheless retaining a clear path to all of these links on relevant pages — so there will still be a clear crawl path, but only on relevant pages, rather than having these links on every page in a thousand-link mega menu. So we're wondering: what could possibly be the impact, the ramifications, of removing these 1,000 mega menu links as static links, even though we're retaining a crawl path to all of them on relevant pages?

John Mueller: On the one hand, it's kind of hard to say, because I don't know how the rest of your site is structured. If, for example, these 1,000 pages are all of your pages, it would be very different from, say, these 1,000 pages being your categories, with a million subcategories below them. But in general, what you're looking at, from a change point of view, is going from a flat site structure to a deeper site structure — I don't know what the official name is.

Raymond: You could say "siloed".

John Mueller: And sometimes that can definitely make sense. We sometimes see folks obsessing about limiting crawl depth, trying to make it so that Googlebot can reach all pages very quickly, and to some extent I think that makes sense. On the other hand, more of a top-down approach, or pyramid structure, helps us a lot more to understand the context of individual pages within the site. In particular, if we know that this category is associated with these other subcategories, that's a clear connection between those parts, and it definitely helps us understand how these things are connected and how they work together. Whereas if it's very flat, we think, oh, all of these are equally important, and we don't really know which of them are connected to each other.
So from my point of view, for a lot of sites it makes sense to have more of a pyramid structure, but at the same time, you don't want it to be such that you have to click through a million times to actually get to the content; you need some reasonable number of clicks to get there.

Raymond: So there is a case to be made for having mega menus — we just want them not to show up as static links that get crawled, while still being available through a user-induced action. We're looking to have a siloed site, more of a pyramid structure, rather than something where we indiscriminately have a thousand links on every page. We've talked to a number of SEO firms, and they told us, well, if you do this, it can greatly negatively impact your site, because all of a sudden these thousand links you had to all these pages are gone — which isn't really true, because we're planning to retain those links, just only on relevant pages. But still they said: don't do it, because if you lose all these one thousand links to all your pages, it's bound to have a negative effect. I'm wondering whether an empirical statement like that can really be made.

John Mueller: I don't think it would always have a negative effect. I do think that if you make it too deep, it becomes harder for us to crawl and harder for us to pass the signals around, but it's not the case that a super-flat structure is always going to be better than a reasonable pyramid structure. Personally, I would try to aim for more of a pyramid structure, just to make it easier for us to understand the context of the individual pages and to forward signals into related areas. But it's also a very significant change on a site like that, so I understand that it makes sense to get more input on the options — maybe even to test things out: take one category, say, I'll try it here, and see what happens with regards to crawling, indexing, and ranking, all of these things — because it is quite a big step when you change your site from a super-flat layout to more of a pyramid.

Raymond: Cool, thank you.

John Mueller: Let me pause the recording here for a moment. I'll be here for a little bit longer, so we can still answer some of the remaining questions. If you're watching the recording, thanks for watching along — I hope you found this useful. I'll be setting up the next batch of Hangouts, probably on Monday or so, for next week and the week after. All right, thanks a lot, and let me just pause.
