This is quite an odd SERP: when I search for "Google.com", it shows Gmail as the 2nd result, but I was quite surprised that the URL is not from Google.com - it is a shortened URL. Have you ever seen this before?
Thursday, June 21, 2012
Wednesday, May 30, 2012
How a Sitemap Benefits Search Engines (Information from a Google Employee)
I found a very helpful answer from John Mueller about Sitemaps, which he posted on Stack Exchange.
A Sitemap file helps search engines to discover new and updated URLs on your website. In particular, if your website is fairly large, then this can help them to be able to focus on the new & updated content, instead of having to blindly crawl through everything to see if anything has changed. That can result in new content being found much faster, which can be quite noticeable especially if the site is larger or more complex.
- Find the number of indexed URLs for your website:
These statistics are recalculated daily and are very accurate. You can find them on the Sitemaps detail page.
- Discover canonicalization issues:
If the numbers there don't match up, that's frequently a sign that you're specifying URLs in the Sitemap file that don't match what we find during our crawling. That's usually a sign that you need to work on canonicalization.
- Help with canonicalization:
When we find multiple URLs on your site that show identical content, we will give any URL that's listed in a Sitemap an extra edge, even if you don't use other canonicalization methods.
- Find badly-indexed parts of your site:
These counts are supplied per Sitemap file, so you can create separate Sitemap files for logical sections of your site, to discover areas where Google isn't indexing as much as you'd like.
Additionally, you can use several extensions in Sitemaps files (eg for images, video, News, or internationalization), should you choose to do that. These extensions are all optional.
- Prioritize crawl errors:
In the crawl errors section, URLs that were specified in Sitemaps files are listed separately. Since you specifically supplied these URLs, we assume that you want them indexed, and that any crawl errors there are important.
For most websites, the most visible element of Sitemaps files is that you can see the indexed URL count. It can take a day or so to appear, so if you just submitted a Sitemap for the first time, you may need to be a bit patient. While other ways (eg a site:-query) are very, very rough approximations, this count is extremely accurate.
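To make the Sitemap format itself concrete, here is a minimal sketch in Python that writes a basic sitemap.xml with <loc> and <lastmod> entries, plus one entry using the optional image extension John mentions. The example.com URLs and the image path are placeholders I made up for illustration.

# sketch: generate a minimal sitemap.xml, including one image-extension entry
# (example.com URLs are placeholders, not real pages)
from datetime import date
from xml.sax.saxutils import escape

pages = [
    {"loc": "https://www.example.com/", "lastmod": date(2012, 5, 30)},
    {"loc": "https://www.example.com/blog/sitemaps-post",
     "lastmod": date(2012, 5, 30),
     "images": ["https://www.example.com/images/serp-screenshot.png"]},
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
         '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">']

for page in pages:
    lines.append("  <url>")
    lines.append("    <loc>%s</loc>" % escape(page["loc"]))
    lines.append("    <lastmod>%s</lastmod>" % page["lastmod"].isoformat())
    for img in page.get("images", []):
        # optional image extension entry (one of the extensions mentioned above)
        lines.append("    <image:image><image:loc>%s</image:loc></image:image>" % escape(img))
    lines.append("  </url>")

lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines) + "\n")

Splitting the output into several Sitemap files, one per logical section of the site, is what makes the per-Sitemap indexed counts described above useful.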
Wednesday, April 4, 2012
Brand Hijacking on Google Places?
I am always concerned about Google Places quality, and I found another case that really disappointed me. Try a query for "Intercontinental Hua Hin": surprisingly, a competitor (Accor) is listed #1 in the Google search results, and if you click the listing, it goes to the Ibis Hua Hin Hotel. The address shown in the result is the Ibis Hua Hin Hotel address, not Intercontinental Hua Hin.
Why doesn't Google use the local information (e.g., address, telephone) from the official site?
Labels: Google, Google places, Search Quality
Tuesday, February 7, 2012
How Does Google Test New Search Algorithms?
I read an interesting interview with Amit Singhal - the guy behind Google's search algorithms. The following highlights what he explained about how Google tests new search algorithms.
"We have the entire web in a sandbox that only our engineers can see, and our engineers can take their new algorithm and see it change millions of queries. If it works, we send it to testers, whom we pay, but we don't tell them what they are testing.
If the tweaks are still deemed useful they are unleashed into the wild - but only to some users.
Then we take a tiny slice, one per cent of our users, and expose them to this change. We measure things such as where on the page they click, when they click higher - that's good for us.
That one per cent are not told, but it's just an experimental algorithm, and the changes are potentially beneficial, so it's not hurting the user's experience.
Concurrently we have approximately 100 ideas floating around that people are testing - we test thousands in a year. Last year we ran around 20,000 experiments. Clearly they don't all make it out there but we run the process very scientifically.
Once the tests are done, a report is built by an independent statistician. "We look at that with a group of senior people who come together every week, and we decide if it's good for users, the web ecosystem and for our systems."
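To make the "one per cent" idea concrete, here is a rough sketch (my own illustration, not Google's actual experiment framework) of how a fixed slice of users could be assigned to an experimental ranking and how a simple click-position metric could be compared between the groups.

# sketch: hash-based 1% experiment bucketing and a simple click-position metric
# (illustrative only - not Google's actual system)
import hashlib
from statistics import mean

def in_experiment(user_id, percent=1.0):
    # deterministically assign roughly `percent` percent of users to the experiment
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 10000
    return bucket < percent * 100

# fake click logs: (user_id, position clicked), where 1 = top result
clicks = [("alice", 1), ("bob", 3), ("carol", 2), ("dave", 1), ("erin", 4)]

control = [pos for uid, pos in clicks if not in_experiment(uid)]
treatment = [pos for uid, pos in clicks if in_experiment(uid)]

# "when they click higher - that's good for us": a lower mean click position
# in the treatment group would suggest the new algorithm is an improvement
if control and treatment:
    print("mean click position: control=%.2f, treatment=%.2f"
          % (mean(control), mean(treatment)))

In practice such a comparison would run over millions of queries and be reviewed by an independent statistician, as Singhal describes; this only illustrates the bucketing and measurement idea.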
One per cent of Google users - does this mean the Page Layout algorithm is still an experimental algorithm? According to this blog post, Google said "This algorithmic change noticeably affects less than 1% of searches globally." How different is the number of Google users from the number of searches?
The most interesting part is the last sentence: "We decide if it's good for users, the web ecosystem and for our systems." I always hear Google emphasize the word "users" when talking about search quality.
What do you think about this?