Thursday, May 31, 2012

How to Use a Press Release for SEO


Press releases are typically developed by the public relations team to get the word out about anything newsworthy; however, if crafted the right way, a press release can also provide SEO benefits. To build links pointing back to a website that will improve search engine trust over time, it's necessary to create and publish content, like press releases, on other web properties. Of course, a press release should never be created solely for SEO purposes. A press release that doesn't have any newsworthy information will never get picked up and is only a waste of time and money. However, if something newsworthy is going on, you might as well use it to your SEO advantage, right?
Here are five important steps for using a press release for SEO:
Choose Keywords
If you are currently executing an SEO campaign, you should know which keywords you are targeting on your website. Choose a few keywords that are most relevant to the press release content to target within the press release.
Implement Keywords into Content
The keywords or keyword phrases being targeted need to be implemented into the press release title and body content in a natural way. Including too many keywords will result in a poor reader experience. It also might get the press release rejected from the distribution service. Be sure to add anchor text links to the keywords being targeted that direct a visitor to a relevant page of your website.
Distribute Through a Paid Service
There are plenty of free services for distributing a press release, but they often come with limitations. Links typically aren't allowed in a free press release, and there is little control over when it gets distributed. Your best chance of getting a press release picked up on other sites is to go through a legitimate paid service.

Include as Web Page/Blog Post

To get the most out of a release, write a different version of it and post it on your web property. This helps to build site content (which the search engines like) and ensures that the content lasts and can be easily found in the future by website visitors looking for more information.
Promote via Social Media
Once the press release has been published, promote it via social media channels and encourage others to share it by including social sharing buttons. If the press release content is interesting to your target audience (which it should be!) and gets shared in social media it will help to improve your social signals.

Wednesday, May 30, 2012

How to Write Title Tags for Your Web Pages


When it comes to search engine optimization, the single most important sentence that you will write for your website is the title tag of your main page. If you write it properly then you will have taken a big step towards getting your site well placed in search engine queries for your important keywords.

Before I give you a step-by-step guide to writing title tags, let's define what they actually are and see why they are important. When you look at a web page in your browser, the writing in the blue strip above the browser's commands (File, Edit, View, etc.) is the title tag. In your actual HTML document, the title tag is in the head portion, between the notation <title> and </title>.

The title tag is important because it "tells" the search engine what the page is about, and in the case of your main page, what your website is all about. I remember back in my school days that we used to take standardized examinations in which we had to read a story and then answer the question: "What would be the best title for this essay?" Choosing a title tag is something like answering this kind of question. You've got to pick out the gist of your enterprise and highlight it in a sentence. So, take a look at your web page and get ready to begin, following these steps:

1. Make sure your three or four most important keywords or keyword phrases appear in the title tag. The most important words should appear near the beginning of the sentence, and they can be repeated within the sentence for added emphasis. For example, if I am offering low-cost web design, then my title tag might look like this: <title>Web Design: Affordable, Low Cost Web Design from the Acme Web Design Company</title>

2. Leave your branding and sales pitch for another part of the web page. Although it is a natural tendency to want to put your company name at the beginning of the title tag, you should remember that unless you are very famous like Coca Cola, people are not searching for you. So, put your most important keywords at the front of the title tag, and establish your brand name with your logo and other elements of the web page. If your company name includes your keywords, like our hypothetical Acme Web Design Company, then put it in the title tag, but not necessarily at the beginning.
Similarly, the title tag is no place for your sales pitch, so keep out flowery or extraneous adjectives, unless they are actual terms used in searches for your product or service.

3. Place your geographical or niche-defining term in your title tag. If you are trying for a top ten or top twenty position for a term such as "web design," then you are really in for a difficult struggle. However, suppose the Acme Web Design Company is located in Columbus, Ohio. Then instead of attempting the almost impossible task of getting the top rankings for the term "web design," it would be far better to get a high ranking in the geographical niche using a title tag such as "Web Design, Columbus, Ohio: Low Cost Web Design in Columbus, Ohio by The Acme Web Design Company".

4. The title tag can be longer than you think. Some guidelines say that the title tag should be no longer than 70 characters. It is true that only the first 70 characters will show in the top bar of the browser, but search engine robots will read the rest of the tag, and the search engines will not penalize you for going over the 70-character mark. Take a look at highly ranked sites in heavily competitive categories and you will see examples of long title tags. Write the tag according to your need to get your important words and phrases included in a sentence that best describes what your product or service is about (see the short sketch below).

5. Vary the title tags on the inner pages of your website. Even with a long title tag, it is not possible to highlight all the possible terms which someone might use to find your website. This is not a problem if you make use of the other pages of your website. Instead of simply having a title tag that says "services," our web design firm could highlight "low cost, web design services" on that page. The "contact" page could be used to emphasize the geographical location once again, and so on. Many websites make the mistake of repeating the same title tag on each of the inner pages of the site. Avoid this and use each of your pages' titles to target important keywords and keyword phrases.
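
As a quick, hypothetical illustration of the 70-character point above, here is a tiny Python check using the Acme example title from step 1; the 70-character cutoff is just the display guideline quoted earlier, not a hard limit:

    # Show how much of a title tag falls inside the ~70 characters that
    # browsers and snippets typically display, per the guideline above.
    title = "Web Design: Affordable, Low Cost Web Design from the Acme Web Design Company"
    print(len(title))    # 76 characters in total
    print(title[:70])    # the portion most likely to be displayed
    print(title[70:])    # the overflow that crawlers still read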

So, take a look at your website's title tags, and see if you can improve them. The effort that you make will be well rewarded.

Tuesday, May 29, 2012

How To Write A Blog In A Manner That People Want To Read


Anyone who wishes to write a blog can start one, but not everyone knows how to write a blog people actually want to read. First, decide what bloggers like you can do to keep visitors coming back again and again after their initial visit. Every effort on your blog, from your content to your design and everything in between, will impact your readers. Take a look below for more information on how to write a blog that people want to read.


Select:
First, choose what type of blog you want to create. Carve out a niche and pick a catchy title that captures the essence of your blog. Remember that a blog is just like your clothes: it's an extension of you. For most people, your blog may be the only thing by which they identify you, so be sure your work is reflected in it.


Make a Choice:
Then decide how often you are going to post.
*Some claim that posting at least once every day is the best option.
*Also some say that three quick posts a day are far more effective than one long post every three days.
*Yet others still claim that, when they update a blog every other day, they get more readers than when updating two or three entries in a single day.
But make your own choice, one that fits you.
Write whenever you want – it's the content that matters! Whatever you do, remember that, for most readers, it's all about the reading, and many of them prefer quality over quantity. Once you get started, you'll find that you have attracted a certain readership, and you will have to adjust your journal to appease and keep the readers you've gained.


Share:
Tell your close friends about your blog and ask them to tell their friends too. Often, if you use it as another way to network with people around you, you'll get a better response. But if you push it too hard, then don't be surprised if they ignore your blog, because they may feel you're fishing for compliments and attention. Always remember one thing: blogging is about you, and the more of yourself you put into it, the more people are going to notice.


Make People Love Your Blog:
Look around the Internet for blogs that people love to read, then read and post to them regularly. Leave comments that actually have something to do with the site, so the owners know you took the time to pay attention to the material posted.
But do not expect anything back in return. Just commenting will make others more likely to visit your blog and do the same. Often, when you comment on sites, a link to your own personal site will already be included with your comment, unless you are posting from one hosting site to the next.


Present Yourself with Variety:
Variety is also very important. Instead of just having you and your blog posts all the time, you can have:
*Guest bloggers,
*Videos,
*Podcasts,
*Also you can play around and add some YouTube videos for fun.
These are just a bunch of ways to make your blog seem more exciting to your reader.


Make Valuable Content:
Provide valuable content that will help people. But please don't come up with content just for the sake of getting people to click your link or to buy something from you. If you create content that is informational and helps someone out there, people will automatically share it and it will go viral. That will help you generate more visitors for your blog and eventually more leads. Remember: Content is King!


Dos & Don’ts:
Below are a few dos & don'ts for writing a blog:


Dos:
*Find your focus.
*Be relatable.
*Be yourself.
*Use links within your posts.
*Include images.
*Respond to blog comments.
*Post to Facebook, Twitter, Google+ & also anywhere else you can post.


Don’ts:
*Don’t set unrealistic goals.
*Keep to a limited word count.
*Don't ever post with grammar mistakes.
*Don't write overly long paragraphs.
*Avoid trying new things until you get used to writing blogs.
*Don't create a negative image.


Conclusion:
Keep your content as concise as possible. Don’t beat about the bush. State your point, give an example, and close on it. People don’t want to read posts that are too long, so do your best to keep it short, but not too short.

Friday, May 25, 2012

10 Myths That Scare SEOs But Shouldn't - Whiteboard Friday


This week I want to address some of the myths that form in the SEO world that get people really scared and worried and asking questions in Q&A and on Twitter and on forums going, "Hey, wait a minute. I heard that this is a problem. Is this going to cause something bad with my site?" Let me try to put these to rest and explain each one. We've got ten. Let's get to them.
Number one: I'm worried because I have too many links pointing to my site from one particular domain. Maybe it's a site-wide link. Maybe they just embedded you in their blogroll, and it's linking to you. This isn't a problem unless the links are coming from a highly manipulative source, in which case you'd hope they weren't linking to you anyway. But I wouldn't stress too much about it. I'll get to people pointing bad links to you in a second. If you have 80,000 links pointing to you from one particular site, don't stress. This isn't going to kill your SEO. It's not the end of the world. If there's a good, editorial, natural reason why those links should exist, it's probably going to help you. What it won't do is help you 79,000 times more than if you just had a few pages on there, but it will help. It's not a terrible thing. Don't panic. I would almost never worry about this unless the links are from particularly terrible, spammy pages, in which case you might sort of worry, right?
Number two: People have been worried, particularly with Google's Penguin update, that, "Oh, the links that I have might be hurting me." Great. Okay.
If you bought those links and you did it in a manipulative way, you acquired them somehow, fine. Contact those people. Please tell them to take those links down. If other people are just building spammy links to you, do not sweat it.
Sweat earning great editorial links. Great editorial links, a fantastic site, great user experience, tremendously valuable content that people don't want to live without, and building a real brand on the Internet, those things will protect you far better from spammy links than trying to contact webmasters one by one and get them to take down your link profiles. There are cases where you might need to do this if you have done or someone else has done bad linking on your behalf in the past, but these are rare. They're few and far between. I'd worry much, much more about building up a great site.
Number three: My keyword density is too high. I don't know where this concept came from. I know years ago people worried about keyword density as in the percentage of keywords on a particular page that are my target phrase that I'm trying to rank for. That's a good search engine signal, and I should try to make my keyword density 2.78%. No. A) You don't need to worry about that, and B) you also don't need to worry about it being too high. There was then this myth that, oh wait, if my keyword is too high a percentage of the content on the page, maybe they won't use it for ranking, but they'll flag it for spam. Years ago Bing did say, "Yes, keyword density, we might look at that as a signal of how we do things." If you're writing content naturally and you've got a great user experience, and it just so happens that you have an e-commerce product page where the title is the name of the product and then the product description contains the title twice, and that's just how it goes and that's natural and it's in the headline, and it happens that, oh no, my keyword density here is 30% or 40% of the text on the page, don't panic. That's okay. That's a fine thing.
As long as you're doing things naturally, you really never need to worry about keyword density. It's when you're doing manipulative kinds of things and building pages just to rank and stuffing them with keywords, then you might start to get into danger territory. But even then, keyword density is probably not the way to measure it. Measure it by looking at the page and being logical and saying, "Does this look like a great page for users?" If not, "Wait a minute. Is the word on here four times, and I only have ten other words? Oh no." Don't panic.
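To make that percentage concrete, here is a minimal Python sketch of how keyword density is typically calculated (target-phrase occurrences over total words); the page text and phrase are invented purely for illustration:

    # Naive keyword density: share of the page's words accounted for by the target phrase.
    def keyword_density(text, phrase):
        words = text.lower().split()
        target = phrase.lower().split()
        hits = sum(
            words[i:i + len(target)] == target
            for i in range(len(words) - len(target) + 1)
        )
        return 100.0 * hits * len(target) / max(len(words), 1)

    page = "Red widgets on sale. Buy red widgets today, because red widgets are great."
    print(round(keyword_density(page, "red widgets"), 1))  # about 46% of the words

A number like that only matters if the page was stuffed to reach it; by the argument above, what counts is whether the page reads naturally to a user.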
Number four: Other sites are scraping your site or your blog - your RSS feed is the most common way - and then republishing it elsewhere. Not only should you not panic about this, but I might say you should be a little proud of this. This means that, great, the Internet has discovered you. They've decided your RSS feed is good, useful, and worth copying and reposting. If they're reposting other places, 99% of the time they're also linking back to anything that you link to, including your own site. So having your blog picked up and scraped is just fine. Some of these, yes, they're spammy, manipulative, and junky. Don't worry. Google's not going to hold that against you. It's not your fault. Every site on the Web has this. Literally SEOmoz, I think, is copied by 200 plus different aggregators who all republish our content, maybe more than that. Don't stress. Don't worry about it. What you can do, what you should do, is make sure that those links that you've got are absolute links, so that when they're copied and picked up, they point back to your site. That's a great way to go. But don't panic about this. A lot of these uses are also legitimate.
Number five: What if Google sees my analytics because I'm using Google Analytics, and then they see that my engagement rates are low? I have a high bounce rate, low time on site. Are they going to punish me for low engagement and give me a penalty? No, they are not. Don't panic about this either. Number one, Google has promised that the Google Webspam Team and Search Quality Team do not get data directly from Google Analytics. In the aggregate, they might be using it to inform some things, but they are not looking at your site's analytics and saying, "Oh, let's punish that guy. Let's punish him for having low engagement, low time on site." They might see that people are bouncing off your page and back to the search results and being unhappy and those kinds of things. But if you're delivering a good user experience, if you're delivering a great answer to simple questions, your bounce rate is going to be high, and your engagement and time on site is going to be low because you've answered the user's query very quickly. Think of Q&A sites that are essentially answering dumb, simple questions like: What year was Franklin Roosevelt born? Oh, good, it was this year. Good, I'm out of here, I'm done. You're gone. Don't worry about this low engagement, low usage. And don't worry about Google seeing into your analytics. They're not going to penalize you for it.
Number six: If this link is reciprocal, meaning I link to this site and they link back to me, will I get penalized for it? Does it lose its value?
Should I not link to the places that are linking to me? What if the New York Times links to me? I want to share that article with all my readers and say, "Oh, look, the New York Times covered me." But I don't want to make it a reciprocal link. Stop worrying. This is not a big concern. You don't need to worry about reciprocal links from this perspective. Years ago, there was this practice, and it still exists a little bit, where people would create pages and pages of links. They'd all point to their friends who they found on the Web. Their friends would all point back to them, and reciprocal links became a bad word because it was a spammy tactic that the engines had a pretty easy time identifying. But if you're just sharing the stuff that's sharing you, this is a fine thing to do. Don't panic. Don't worry that just because you're linking to something, the link back won't count.
Number seven: I'm linking with non-ideal anchor text. Is this going to hurt me? I have this page and I want to point to it internally or externally with a link, and I wanted it to contain this anchor text, but it's not as user-friendly and I'm worried people won't click through on it, or it seems a little manipulative, or I just can't get my product team to buy into that. It's okay. Don't panic. Don't worry about that either. In fact, there's a lot of suspicion in the SEO space right now that Google is looking at exact match anchor text and saying, "This stuff is not natural. This isn't normal. Why are people linking like this?" If you have an opportunity where it fits well with user experience, fits well with the content, and the anchor text makes sense, great. Fantastic. Take that opportunity. Earn that link. But don't stress if many of your links are pointing with a brand. This is again part of that density myth, where people think, oh, wait a minute. If 100 links point to me but 50 of them don't have my anchor text, then I won't rank for that. This is not a problem. You're going to be just fine. Don't stress.
Number eight: There are links in my footer. I have a footer on my website. I've got links in there. Are those going to negatively affect me? I've heard lots of bad things about footer links. Most of the time, this is not a problem. Again, it goes back to the same thing that we've been talking about throughout this Whiteboard Friday, which is whether you're doing it for good user experience. Take a look at one of my favorite footers, which is on Zappos.com. They have a great footer. It's long, it's lengthy. It almost feels too long, but it has fun stuff in there. It makes me like the company even more. It links to a lot of good things. Great, no problem. However, if you're stuffing tons and tons of links and you've got a footer that, oh here's an exact match anchor text; there's another exact match anchor text; there's another exact match anchor text; and I've got a big old list of them, and it goes all the way down my footer, you start to look like you're manipulating the search results. We've actually seen that when people have pulled these or made their footers look more natural and more user-experience centered, the penalties were actually lifted. So it looks like Google algorithmically penalizes people for tons of stuffing and bad keywords in the footer. But just because it's in the footer doesn't necessarily mean it's bad. Don't stress just because of this word footer and footer links.
Number nine: Will URLs without keywords prevent me from ranking well? I don't know where this myth came from, but there's like this world of, "Oh, look, it's /123 or /?id=7 instead of /keyword which I wanted to rank for."
This is not a tremendous problem. Certainly if you can get to the point where your URLs are keyword friendly and they're static, that's good. That's best practices. You want to make it so that when someone reads your URLs offline or sees them in an email or a tweet, they go, "Oh, I bet I can guess at exactly what's on that page," and that's a wonderful thing. Yes, when people copy and paste those URLs, the keywords will be in there. That's nice. But this is not going to prevent you from ranking. You see tons of pages that rank very well that do this. I would not stress about this. I wouldn't necessarily jump through tons of hoops to have all your URLs rewritten. It can be a big engineering effort. Sometimes it pays off. When you're doing a site redesign anyway, go for it. But I wouldn't make that the centerpiece of your SEO campaign. Oftentimes, this is not going to move the needle as much as you think it will.
Number ten, our last one: What about link bait? I'm worried about link bait and content marketing efforts and building this great content stuff, having a blog, having infographics, and having these cool videos, because they're not my product pages or sales pages. Won't Google eventually penalize for this because they don't want to see people just engaging in producing great content and earning links to their site? No. Google and Bing have both stated very specifically that they love this practice of content marketing, of doing great stuff on the Internet, even if it's only partially or semi-relevant to your particular niche or industry or customers. This is like saying, "Hey, I have a business that hosts a bunch of events. I have a business that donates to charity. I have a business that is one of the best employers in the state." It is interesting and does cool stuff outside of our pure product and sales process. That is a good thing. That is a great way to earn branding and awareness and attention. It's a great way to do well in social media and earn a following there. It's a great way to have content that's spread throughout the Web. It will help with SEO because of the rising tide phenomenon, which is essentially your site is this ship sailing on the ocean, and as the tide rises from all the links that are pointing into you, essentially your domain's link juice rises and authority rises, all the pages on there will perform slightly better. Google is not going to take away this power and essentially say, "Oh, you know what? We're only going to count links to the exact page and we're only going to count them exactly this. We don't want this concept of domain authority." They love the concept of domain authority because they love the world of brands and branding. I would not stress that your content marketing and link bait efforts are going to be penalized or devalued. In fact, I would continue to focus on them. And if you can find ways to make the audience overlap well with what the people are actually buying, that's even more fantastic.
I hope you've enjoyed this Whiteboard Friday. I want you to de-stress, stop worrying about some of these myths that I know are popping up all over the place. Stop being scared of words like footer links and footers and URLs without keywords and keyword density. Just because these words are out there, just because they're causing problems for some people who are doing things in a spammy, manipulative way, doesn't mean every SEO needs to stress about them.
All right, everyone, I hope you've enjoyed this Whiteboard Friday. We'll see you again next week. Take care. 

Wednesday, May 16, 2012

Five Ways to Generate More Traffic to Your Blog


It takes a lot of effort to operate a business blog. It's certainly not easy, which is why many businesses end up failing at their blogging attempt. One of the most common complaints is that the time spent doesn't seem to show any return. "Why should I commit to blogging if nobody is reading my posts?" It takes a long time to build a loyal blog following; sometimes it can take years. The key is to stick with it and do what you can to improve traffic to your blog. Here are five tactics:

Optimize Blog Posts:
Blog posts can rank in the search engines for specific keywords and keyword phrases. Incorporate keywords into the title and content of each blog post to get noticed by the search engine spiders. Use a keyword research tool to find out how people are searching for information about your industry and write posts that address these topics. Keywords are important, but don’t go overboard. Always keep the visitor experience as the top priority.

Build Links:
When conducting an SEO link building campaign, build links to your blog just as you would build links to your homepage or other important internal pages. Submit the blog to blog directories and include anchor text links that point back to the blog when publishing content like articles and press releases online.


Build up a Social Media Presence:
There are many benefits to social media. Operating an active social media presence is a way for target audience members to learn more about your business. Share links to blog content with social followers. If they find it beneficial, they will share it with their followers which will build your links and improve visibility.


Write Guest Posts:
Once you have become a trusted source of information you may have the opportunity to contribute content to other blogs. While you may think that it’s hard enough writing content for your own blog, this opportunity shouldn’t be passed up. It’s a great way to market your blog and get your content in front of a new audience. If they like what they see, they will click over to your blog to learn more.


Leverage Newsletter:
Email newsletters are still an effective marketing tool. People that have opted in are obviously interested in your brand. Include links to blog post content in every newsletter that is sent out.

It's Time to Write Content That Panda Loves


When we talk about Google Panda content, there is no precise definition of the term. Although the Penguin update has now arrived on the scene and brings more realistic considerations, Panda still enjoys its relevance in the domain of quality, original content. This article illustrates the ways in which Internet marketing professionals can develop Panda-friendly content.
Be Original, Unique and New:
This is probably the most important aspect that a content writer should think about while writing web content or articles for business websites. Your readers are very special to you, so give them an opportunity to read something good and original. When you take up a topic, cover it in full detail and offer the audience a final insight. You can add more creativity to articles by adding videos, slide shows, and images where possible.
Highlight The Portions for Visibility:
Every piece of content has some striking points that may attract readers instantly. Bold or highlight them. You should also highlight content elements that are important to the text. Highlighting keywords within the content will have a big impact on your article: it not only helps search engine bots index your article efficiently but also gives readers a chance to scan the important points of the content.
Target Your Potential Audience Only:
Every website has a specific business motive to accomplish. You have to target only those customers who are relevant to your business domain. Internet marketing experts therefore write their content with the requirements of potential readers in mind. All you have to do is offer quality articles to your readers without losing the core motive of paragraph optimization.
Add Heading Tags:
Proper headings and headlines play a significant role in an article. Mark the main title and section titles with the appropriate heading tags. Descriptive and illustrative titles give the user more information. Include keywords in the headings in a meaningful manner, and only where the need arises.
Strict 'No' To Plagiarism:
Copied content has no value and serves almost no purpose for either the author or the reader. You are therefore advised not to copy content from competitors or other sites in your domain. Search engines don't like copied content and hate to see duplicate content in their index.
Include Social Media Gadgets:
A professional writer from an SEO content writing company can also take advantage of social media platforms like Facebook and Twitter. There is no better way to spread your article among the masses.

Sunday, May 13, 2012

Tips to Increase Ranking and Website Traffic


It is worth cataloguing the basic principles to follow in order to increase website traffic and search engine rankings.
 
•    Create a site with valuable content, products or services.
•    Place primary and secondary keywords within the first 25 words in your page content and spread them evenly throughout the document.
•    Research and use the right keywords/phrases to attract your target customers.
•    Use your keywords in the right fields and references within your web page, such as the title, META tags, and headers.
•    Keep your site design simple so that your customers can navigate easily between web pages, find what they want and buy products and services.
•    Submit your web pages (i.e., every web page, not just the home page) to the most popular search engines and directory services. Hire someone to do so if required. Be sure this is a manual submission; do not engage an automated submission service.
•    Keep track of changes in search engine algorithms and processes and accordingly modify your web pages so your search engine ranking remains high. Use online tools and utilities to keep track of how your website is doing.
•    Monitor your competitors and the top ranked websites to see what they are doing right in the way of design, navigation, content, keywords, etc. 
•    Use reports and logs from your web hosting company to see where your traffic is coming from. Analyze your visitors' locations, their incoming sources (whether search engines or links from other sites), and the keywords they used to find you.
•    Make your customers' visits easy and give them plenty of ways to remember you in the form of newsletters, free reports, discount coupons, etc.
•    Demonstrate your industry and product or service expertise by writing and submitting articles for your website or for article banks so you are perceived as an expert in your field.
•    When selling products online, use simple payment and shipment methods to make your customer’s experience fast and easy.
•    When not sure, hire professionals. Though it may seem costly, it is a lot less expensive than spending your money on a website which no one visits.
•    Don’t look at your website as a static brochure. Treat it as a dynamic, ever-changing sales tool and location, just like your real store to which your customers come, and give it the same seriousness.

Thursday, May 10, 2012

How to Improve Your Click Through Rate (CTR)


What Is CTR?

CTR is the percentage of viewers that actually click on your ad. CTR is different from impressions, which is the number of people who view your ad, whether they click through or not.
By improving your CTR with AdWords, you achieve the following:
  1. Increase your traffic.
  2. Lower your CPC (thanks to the Rank = CPC * CTR equation).
The click through rate (CTR) is crucial for your success. You should always try to improve it.
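
To make the arithmetic behind that simplified Rank = CPC * CTR relationship concrete, here is a minimal Python sketch with made-up numbers (real AdWords ad rank involves more factors than this):

    # CTR = clicks / impressions. With Rank = CPC * CTR, a higher CTR lets you
    # hold the same rank at a lower cost-per-click. Numbers are illustrative only.
    def ctr(clicks, impressions):
        return clicks / impressions

    def cpc_needed_for_rank(target_rank, ctr_value):
        return target_rank / ctr_value

    my_ctr = ctr(clicks=30, impressions=2000)        # 0.015, i.e. 1.5%
    rival_ctr = ctr(clicks=10, impressions=2000)     # 0.005, i.e. 0.5%
    target = 0.015                                    # arbitrary rank score to match
    print(cpc_needed_for_rank(target, my_ctr))        # 1.0 -> our required bid
    print(cpc_needed_for_rank(target, rival_ctr))     # 3.0 -> rival must bid 3x more

In this toy model, tripling your CTR cuts the CPC required for the same position to a third, which is the sense in which improving CTR lowers your CPC.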

What Is a Good CTR?

A good CTR is certainly above the 0.05% rate that AdWords asks you to maintain in order to keep your ad running for each keyword. If your CTR gets lower than that, your ad is disabled after it receives a thousand impressions. Certainly, 0.05% is not enough. But it’s hard to tell what is good and what is bad CTR beyond that.
CTR depends on the following:

Keyword popularity

Suppose your CTR for a keyword like “shoes” is 1.5%. Is that good or bad? Usually, a general keyword like that is also very popular. These kinds of words generate a lot of impressions, but the percentage of people that actually click on ads is low. In these circumstances, your 1.5% CTR is good. If you choose “repair jogging shoes” instead, you’ll probably get fewer impressions on this one, and fewer clicks, but still a much better click through rate. This second keyword is not so popular, but it is much more targeted. When comparing two similar keywords, a general one and a specific one, you should see a big difference in the CTR they get as well as in their conversion rates. A keyword that has both low popularity and low CTR is actually low value.

Your competition

To follow the same example, we expect a popular keyword like “shoes” to have tough competition. This means a higher CPC, and fewer chances for you to get clicks. But it’s very possible that few people are bidding on “repair jogging shoes” — if any — and in this case you have all the viewers of your ad to yourself.
There’s no universally low, high, or average CTR; you have to take into account the circumstances mentioned above. Beyond that, consider the quality of your ad, your bids, and your daily budget, because these too can influence your CTR.

How Keyword Strategies Influence CTR

General keywords generate a lot of impressions but few click-throughs and even fewer conversions, because they can’t really filter the potential customers. Specific keywords on the other hand are more focused, therefore they have more power to select potential clients.
Ways to use keywords for increasing your CTR:
  • Avoid using very general keywords (shoes).
  • Focus on using targeted keywords which describe your actual products or service in a rather specific way (repair jogging shoes).
  • Collect all related keywords in targeted Ad Groups.
  • Create ads that are highly targeted to the keywords in that Ad Group.
There are two other very important areas to take care of if you want high click through rates: your ad copy and your ad position.

Wednesday, May 9, 2012

Google Panda Update vs. Google Penguin Updates


The SEO community has been abuzz this past week with the latest update from Google, named Penguin. Penguin came down the pipeline last week, right on the tail of the latest Panda update. Since most of the big updates in the past year have been focused on Panda, many site owners are left wondering what the real differences between Panda and Penguin are. Here is a breakdown:

Google Panda Update Overview:

According to Google’s official blog post when Panda launched,
This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
Basically, Panda updates are designed to target pages that aren’t necessarily spam but aren’t great quality. This was the first ever penalty that went after “thin content,” and the sites that were hit hardest by the first Panda update were content farms (hence it was originally called the Farmer update), where users could publish dozens of low-quality, keyword-stuffed articles that offered little to no real value for the reader. Many publishers would submit the same article to a bunch of these content farms just to get extra links.

Panda is a site-wide penalty, which means that if “enough” (no specific number) pages of your site were flagged for having thin content, your entire site could be penalized. Panda was also intended to stop scrapers (sites that republish other companies’ content) from outranking the original author’s content.
Here is a breakdown of all the Panda updates and their release dates. If your site’s traffic took a major hit around one of these times, there is a good chance it was flagged by Panda:
1. Panda 1.0 (aka the Farmer Update) on February 24th 2011
2. Panda 2.0 on April 11th 2011. (Panda impacts all English speaking countries)
3. Panda 2.1 on May 9th 2011 or so
4. Panda 2.2 on June 18th 2011 or so.
5. Panda 2.3 on around July 22nd 2011.
6. Panda 2.4 in August 2011 (Panda goes international)
7. Panda 2.5 on September 28th 2011
8. Panda 2.5.1 on October 9th 2011
9. Panda 2.5.2 on October 13th 2011
10. Panda 2.5.3 on October 19/20th 2011
11. Panda 3.1 on November 18th 2011
12. Panda 3.2 on about January 15th 2012
13. Panda 3.3 on about February 26th 2012
14. Panda 3.4 on March 23rd 2012
15. Panda 3.5 on April 19th 2012
Search Engine Land recently created this great Google Panda update infographic to help walk site owners through the many versions of the Google Panda updates.
Many site owners complained that even after they made changes to their sites in order to be more “Panda friendly,” their sites didn’t automatically recover. Panda updates do not happen at regular intervals, and Google doesn’t re-index every site each time, so some site owners were forced to deal with low traffic for several months until Google got around to re-crawling their website and taking note of any positive changes.

Google Penguin Update Overview:

The Google Penguin Update launched on April 24. According to the Google blog, Penguin is an “important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Google mentions that typical black hat SEO tactics like keyword stuffing (long considered webspam) would get a site in trouble, but that less obvious tactics (like incorporating irrelevant outgoing links into a page of content) would also cause Penguin to flag your site. Says Google,
Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.
Site owners should be sure to check their Google Webmaster accounts for any messages from Google warning about past spam activity and a potential penalty. Google says that Penguin has impacted about 3.1% of queries (compared to Panda 1.0’s 12%). If you saw major traffic losses between April 24th and April 25th, chances are Penguin is the culprit, even though Panda 3.5 came out around the same time.
Unfortunately, Google has yet to outline exactly what signals Penguin is picking up on, so many site owners that were negatively impacted are in the dark as to where they went wrong with their onsite SEO. Many in the SEO community have speculated that some contributing factors to Penguin might be things like:
1. Aggressive exact-match anchor text
2. Overuse of exact-match domains
3. Low-quality article marketing & blog spam
4. Keyword stuffing in internal/outbound links
It’s important to remember that Panda is an algorithm update, not a manual penalty. A reconsideration request to Google won’t make much of a difference; you’ll have to repair your site and wait for a refresh before your site will recover. As always, do not panic if you are seeing a downturn in traffic; in the past, when there has been a major Google update like this, things have often rebounded. If you do think you have some sort of SEO penalty as a result of either the Google Panda or Google Penguin updates, please contact your SEO service provider for help or start troubleshooting.

Three Underused Features of Google Analytics


With so much data available through Google Analytics, sometimes it's easy to fall into the trap of always looking at the same things – reviewing keyword referrals or landing pages, or simply looking at traffic trends.
However, a wealth of available data is often overlooked. Sometimes reviewing something different, or slicing the data in a new way, is all it takes to find a great new opportunity or diagnose a long-standing problem.
Here are three top features in Google Analytics for giving a fresh look to data you may have looked at too many times before.

Advanced Segments

Advanced segments are a godsend for any type of analytics review or diagnosis. By default, you can look at traffic for organic search (or paid depending on the client focus) vs. all traffic to understand the trends and traffic changes you're seeing for the work you're doing against the overall traffic to the client's site.
Segmenting only new traffic is also a good way to help understand if a site is providing the relevant information in the right places to take brand new visitors to a conversion.
I have also recently started using custom segments to identify and exclude visitors who log in (set up a new custom segment to exclude any visitor who lands on the "thank you for logging in" page of your site). This is a useful alternative segment to help understand the behavior of new and returning visitors who have never signed up to, or purchased from, a site.

Audience Behavior, Frequency, Recency & Engagement

Frequency and recency refer to how often, and at what time intervals, a visitor returns to the site. Engagement metrics look at how long users are spending on site and how many pages they look at.
While these reports are somewhat broad brush, they make great starting points for understanding what direction you should take in your analytics research.
For instance, looking at the frequency of visits to a hotel site might show that there is a high return rate within 3 days, and a second peak at 50-100 days. This suggests that a large number of visitors are taking a three day period to review their hotel options before making a booking, and that they will generally wait 2-3 months before looking to book again.
When we combine this data with an advanced segment of only converting visitors, the data becomes even more stark. This data can be used to influence retargeting and email campaigns to increase conversion and return visitor rate.
Looking more closely at engagement, we can choose to review visit duration and page depth. I generally expect to see a peak for 0-10s visit duration and then a second peak for much longer duration periods, but the size of these peaks can be very telling about the relevancy of the information being provided to visitors.
By understanding the number of pages that converting visitors look at (again, using this feature in conjunction with advanced segments), you should see a minimum number of page views required by the conversion or shopping cart process. However, if you see a small number of page views above this number resulting in high conversions, you might be prompted to send more traffic to these pages, or to understand the topic of these pages and focus additional efforts there.

Multi-Channel Funnels

Multi-Channel funnels were launched in August last year, and provide extremely interesting data on the paths that customers take before converting on your site. This is extremely useful information for understanding the true value of a referral source over and above the last touch conversions that it drives.
This is especially useful for low conversion referral channels, and can provide valuable insights into true value. You can see first touch conversion data through the reports in this section and also start to understand how many visits it takes for a customer to reach conversion.
The downside to this report is that the data isn't very granular (you can see for instance that a customer clicked through paid search, organic search and eventually converted through a direct visit, but you can't see the keywords involved). Despite this, it's a huge step toward better understanding of customer behavior and the true value each channel provides.

Google Penguin Update: 5 Types of Link Issues Harming Some Affected Websites


Are you angry and looking for answers about why your rankings vanished after Google released its Penguin update? Early analysis indicates that one common factor thus far appears to be the signals from the links pointing to your website.
The main purpose of the Penguin update is to put a deep freeze on web spam in Google's search results. By extension, a big piece of that web spam appears to be links from low-quality networks.

Natural Links

Before we get into the new findings, first it’s important to understand a bit about Google and links.
Above all, Google considers links as editorial "votes". So, theoretically, the sites that receive the most votes should rank higher on Google because more people find them valuable.
Google analyzes the quantity, quality, and relevance of websites that link to yours. When Google looks at your link profile, they’re looking at such things as what types of websites link to yours, how quickly you acquired these links, and the anchor text (the clickable words) used by the linking website. When Google's algorithm detects such things as a large number of new links or an imbalance in the anchor text, it raises a big red flag.
As Google and many SEOs have preached for years, you’ll attract more links by creating unique, worthwhile content that others will want to link to naturally. If you want to learn more about Google, links, and link building, definitely read our posts “Why Links Matter”, "Filthy Linking Rich", and “Introduction to Google PageRank: Myths & Facts”.

Unnatural Links

For companies that have been hit by the Penguin update, one common theme appears to be a severe lack of natural links, according to a blog post by Glenn Gabe at G-Squared Interactive. He noted five common issues these sites are all facing:
  1. Paid text links using exact match anchor text: For companies that want to rank for a certain term (such as “red widgets”) one way to accomplish this is by buying links from other websites with that exact matching anchor text. This is against Google’s guidelines, as Google would consider this a paid link that exists solely to manipulate PageRank, rather than to provide any value to visitors.
  2. Comment spam: Two things proved problematic for websites trying to unnaturally rank for specific keywords: signatures in comments that contained exact match anchor text; and people who used a spammy user name (e.g., Best India SEO Company) as exact match text.
  3. Guest posts on questionable sites: Although guest posts are a legitimate way to earn links to your site, sites dinged by the Penguin had links pointing to their website from sites filled with low-quality articles where the focus was on the anchor text rather than the content.
  4. Article marketing sites: Thin content featuring links with exact match anchor text was another common factor among affected sites.
  5. Links from dangerous sites: Do you have inbound links from sites that have been flagged for malware, numerous pop-ups, or other spammy issues? This was another factor that caused websites to lose their Google rankings, so links to and from web spammers or “bad neighborhoods” are a danger.
Ultimately, the Penguin update didn’t really change anything that Google has deemed unacceptable. Google has just evolved its algorithm to catch up to those who try to loophole their way to higher Google rankings (and, to be fair, some who simply don't know any better or fully understand SEO). If any (or all) of the above are your sole link building tactic(s), you probably aren't doing enough to rank prominently long-term on Google anymore.
For those unfamiliar, Google has a section devoted to link schemes and makes no secret that such practices “can negatively impact your site's ranking in search results.”

Penguin Recovery?

So, fix all these link issues, eliminate any instances of keyword stuffing, spun content, cloaking, and other spammy tactics and you're guaranteed a Penguin recovery, right? Not necessarily. There are never any magical guarantees for gaining or regaining top search rankings and Google is notoriously tight-lipped about the exact signals it uses to detect web spam.
Additionally, Google is constantly making tweaks to its search algorithm. So check your traffic in analytics and make sure your traffic indeed was impacted starting on or after April 24. If your traffic vanished before this date, another change might be to blame – there was also a parked domain classifier issue the week prior to Penguin's launch in addition to the latest Panda refresh on April 19.
Regardless, with the new tag team of Panda and Penguin, Google can put the smack down on websites that appear to be creating or supporting spam to increase their rankings in search engines. So even if you fix all these link signals, you still must make sure you have quality content.
But even beyond that, there are hundreds of other factors at play that Google's algorithm looks at.

Life after Penguin

While it’s much easier to blame Google and sign a petition begging Google to kill its Penguin update, this isn't the time to give up. Now is the time to look at your website, do a proper, careful evaluation of your inbound link profile, clean up your website, and devise a smarter marketing and business strategy that doesn't rely on Google for the majority of your traffic and income, so you can escape the endless loop of Google algorithm updates.



This isn’t to say Google or any search engine results are perfect – though now might be a good time to check out alternatives like Google's closest competitor, Bing, or upstarts Blekko and DuckDuckGo. Google has created a Penguin feedback form for those who feel websites have been hit unfairly, but this update is algorithmic as opposed to a manual penalty (i.e., reviewed by a human), so don’t expect to see whatever rankings you’ve lost miraculously restored overnight.
If you’re a small business, there are ways to Google-proof your marketing. And don't forget to look for non-Google-based link opportunities.
But above all, sometimes when these algorithmic changes roll out, one of the wisest moves is to be patient and carefully analyze any changes before you react blindly to the latest penalty – because by the time you do that, Google will release the latest Panda or its next iteration of Penguin, and you'll be trapped again in the endless loop of relying solely on a third party (Google) for your livelihood.

Tuesday, May 8, 2012

Google - Tips 'n' tricks, submission, listing and ranking


Google seems to be rapidly becoming the most popular search engine. Submitting, getting listed and getting a high ranking in Google can get you a lot of traffic, usually even more than Yahoo. Google uses link popularity when ranking websites, which results in quality search results, making it a favorite among most web searchers.
  • Submitting to Google
You need to only submit your homepage to Google. It will automatically index the entire site. Don't submit individual pages of your site.
Tip: Google's crawler Googlebot will follow all the links in your site. So make sure all your pages are linked otherwise some of your pages will not get indexed.
Submit to Google here
  • Getting Listed in Google
    Getting listed in Google is usually very fast. We got listed within 2 weeks and all our new pages are usually listed within a month. The time frame depends on their crawl schedule and how many other web sites are in queue to be indexed before ours. In our experience, Google usually lists a site within a month.

    Tip: Make sure your web site is ready before you submit it to be listed. Google will index your entire site, content and all. It takes keywords from the metatags as well as the content. You'll find that Google will send you the maximum amount of traffic (in our case it is triple what Yahoo sends us - and Yahoo is supposed to be the #1 Search Engine), so time well spent on your web site will pay off in the long run.

    Tip: Google usually indexes sites during the 2nd week and lists new content during the first few days of the next month. We usually see our new content added on the 2nd or 3rd of the month. A safe bet is to wait for a month to see new pages listed.
  • Ranking in Google
    Getting listed in Google is pretty easy but what really matters is your ranking. The best way to get a high ranking in Google is to have many sites linking to your site. As Google uses link popularity as its most important factor in ranking web sites, getting many sites to link to you is your best bet in getting a high ranking. Get quality links and increased visitor traffic with only minutes of submission efforts!

    Tip: Try to develop quality content which is different from what other sites are offering and you're guaranteed to get a high ranking in Google. Think of specialized keywords instead of using generic terms. Using generic keywords would usually land you on the 30th or 40th page, whereas using specialized keywords, or a combination of keywords, would get you onto the 1st or 2nd page of search results. Think of writing articles that others have not written and you're sure to come up in the first 10 search results. E.g., if we were to write an article on web design, we would probably come up on the 100th page of search results, whereas if we were to write on 'Creating Swap Images in Fireworks' we would probably come up in the first 10 search results.
Google Tips 'n Tricks
  1. Make sure all your important keywords appear in the title, description, content and alt tags of your web page. This increases your keyword density and helps in boosting your ranking.
  2. Write articles, give away freebies, and make your site a quality destination with fresh content. Other sites will link to you naturally, increasing your link popularity.
  3. Here's a cool trick to find out whether you are listed in Google. Type this URL into your browser:
    http://www.google.com/search?q=YOURDOMAIN+site:WWW.YOURDOMAIN.COM
    Google will return a complete list of all the pages on yourdomain.com that exist in Google's index.
    Example:
    http://www.google.com/search?q=entheosweb+site:www.entheosweb.com
  4. You can also find out which pages listed in Google link to your site. Here's the trick: type this URL into your browser and watch the results.
    http://www.google.com/search?as_lq=YOURDOMAIN.COM
    Google will return the pages that link to your website, along with a listing of each URL. (The short script after this list shows how to build both of these query URLs.)
    Example:
    http://www.google.com/search?as_lq=entheosweb.com
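
To check several domains without retyping these URLs, here is a minimal Python sketch that builds both query strings. The helper names are mine; the only pieces taken from the tips above are the q=DOMAIN+site:WWW.DOMAIN pattern and the as_lq parameter, and whether Google still honors these exact URL formats is not guaranteed.

    from urllib.parse import quote_plus

    GOOGLE_SEARCH = "http://www.google.com/search"

    def indexed_pages_url(domain: str) -> str:
        """Tip 3: list the pages of a domain in Google's index (q=NAME+site:WWW.DOMAIN)."""
        name = domain.split(".")[0]  # e.g. "entheosweb" from "entheosweb.com"
        return f"{GOOGLE_SEARCH}?q={quote_plus(name)}+site:{quote_plus('www.' + domain)}"

    def linking_pages_url(domain: str) -> str:
        """Tip 4: list the pages that link to a domain, via the as_lq parameter."""
        return f"{GOOGLE_SEARCH}?as_lq={quote_plus(domain)}"

    if __name__ == "__main__":
        print(indexed_pages_url("entheosweb.com"))
        # http://www.google.com/search?q=entheosweb+site:www.entheosweb.com
        print(linking_pages_url("entheosweb.com"))
        # http://www.google.com/search?as_lq=entheosweb.com

Paste either printed URL into your browser to run the corresponding check.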

Monday, May 7, 2012

Create a dynamic relationship with your readers!


A dynamic relationship with your readers will build trust in you and your platform, deepen your audience's participation, increase your conversion rates, and more.

One of the methods of creating a dynamic relationship is opening a line of communication between you and your audience. One communication strategy you can use with your readers is the power of suggestion: Topic Suggestions, that is!

What are Topic Suggestions?

Your readers can suggest article topics they would like you to write about. They can do this by selecting "Suggest a topic", found:

    In the Get Involved section below your articles.
    In the upper-right section of your Expert Author Bio page.

This will open a suggestion box where your readers can suggest article topics or headlines they would like you to address.

Once a reader has submitted a topic suggestion, our team reviews it to filter out any spam or junk so you don't have to. As a result, you'll be left with substantive topic suggestions relevant to your niche and your readers.

How to Access Your Topic Suggestions

   - Log into your My.EzineArticles.com account
   - Select the Write & Edit tab
   - Select Topic Suggestions from the left navigation menu

From your Topic Suggestions queue, you can view article suggestions your readers or visitors would like you to write about. You can choose to hide suggestions by selecting the "X" to the right of the suggestion. To view your hidden suggestions again, select "Delete" from the "Filter by" dropdown menu.

Encourage your readers to suggest a topic today! In doing so, you can create a dynamic relationship with your readers by writing highly relevant, quality articles specifically targeting their needs and wants.

See if you have any new Topic Suggestions by logging into your My.EzineArticles.com account today!


Friday, May 4, 2012

Google Algorithm Changes for April: Big List Released


As expected, Google has finally released its big list of algorithm changes for the month of April. It's been an interesting month, to say the least, with not only the Penguin update but a couple of Panda refreshes sprinkled in. There's not a whole lot about either of those on this list, however, which is a testament to just how many other changes Google is constantly making to its algorithm: signals (some of them, at least) that could help or hurt you in ways beyond the hugely publicized updates.

We’ll certainly be digging a bit more into some of these in forthcoming articles. At a quick glance, I noticed a few more freshness-related tweaks. Google has also expanded its index base by 15%, which is interesting. As far as Penguin goes, Google does mention: “Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.”

Keyword stuffing is against Google’s quality guidelines, and was one of the specific things Matt Cutts mentioned in his announcement of the update.
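
Google hasn't published how its keyword stuffing classifier actually works, but the underlying signal is easy to illustrate. Here is a purely illustrative Python sketch of a naive keyword-density check; the tokenization and the 5% threshold are my own assumptions, not figures Google has disclosed.

    import re
    from collections import Counter

    def keyword_density(text: str, keyword: str) -> float:
        """Fraction of the words in `text` that are the given keyword (case-insensitive)."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        return Counter(words)[keyword.lower()] / len(words)

    def looks_stuffed(text: str, keyword: str, threshold: float = 0.05) -> bool:
        """Flag text where a single keyword exceeds ~5% of all words
        (an arbitrary illustrative threshold, not a Google figure)."""
        return keyword_density(text, keyword) > threshold

    sample = ("Buy cheap shoes. Our shoes are the best shoes. "
              "Shoes shoes shoes for every occasion.")
    print(looks_stuffed(sample, "shoes"))  # True

A real classifier presumably weighs far more than raw density (keyword placement, hidden text, anchor-text patterns), but repetition like the sample above is the behavior this changelog entry targets.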

Interestingly, unlike previous lists, there is no mention of Panda whatsoever on this list, even though there were two known Panda data refreshes during April.
Here’s the list in its entirety:
  • Categorize paginated documents. [launch codename "Xirtam3", project codename "CategorizePaginatedDocuments"] Sometimes, search results can be dominated by documents from a paginated series. This change helps surface more diverse results in such cases.
  • More language-relevant navigational results. [launch codename "Raquel"] For navigational searches when the user types in a web address, such as [bol.com], we generally try to rank that web address at the top. However, this isn’t always the best answer. For example, bol.com is a Dutch page, but many users are actually searching in Portuguese and are looking for the Brazilian email service, http://www.bol.uol.com.br/. This change takes into account language to help return the most relevant navigational results.
  • Country identification for webpages. [launch codename "sudoku"] Location is an important signal we use to surface content more relevant to a particular country. For a while we’ve had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.
  • Anchors bug fix. [launch codename "Organochloride", project codename "Anchors"] This change fixed a bug related to our handling of anchors.
  • More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains.
  • More local sites from organizations. [project codename "ImpOrgMap2"] This change makes it more likely you’ll find an organization website from your country (e.g. mexico.cnn.com for Mexico rather than cnn.com).
  • Improvements to local navigational searches. [launch codename "onebar-l"] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
  • Improvements to how search terms are scored in ranking. [launch codename "Bi02sw41"] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you’re searching. This change improves the way those terms are scored.
  • Disable salience in snippets. [launch codename "DSS", project codename "Snippets"] This change updates our system for generating snippets to keep it consistent with other infrastructure improvements. It also simplifies and increases consistency in the snippet generation process.
  • More text from the beginning of the page in snippets. [launch codename "solar", project codename "Snippets"] This change makes it more likely we’ll show text from the beginning of a page in snippets when that text is particularly relevant.
  • Smoother ranking changes for fresh results. [launch codename "sep", project codename "Freshness"] We want to help you find the freshest results, particularly for searches with important new web content, such as breaking news topics. We try to promote content that appears to be fresh. This change applies a more granular classifier, leading to more nuanced changes in ranking based on freshness.
  • Improvement in a freshness signal. [launch codename "citron", project codename "Freshness"] This change is a minor improvement to one of the freshness signals which helps to better identify fresh documents.
  • No freshness boost for low-quality content. [launch codename “NoRot”, project codename “Freshness”] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
  • Tweak to trigger behavior for Instant Previews. This change narrows the trigger area for Instant Previews so that you won’t see a preview until you hover and pause over the icon to the right of each search result. In the past the feature would trigger if you moused into a larger button area.
  • Sunrise and sunset search feature internationalization. [project codename "sunrise-i18n"] We’ve internationalized the sunrise and sunset search feature to 33 new languages, so now you can more easily plan an evening jog before dusk or set your alarm clock to watch the sunrise with a friend.
  • Improvements to currency conversion search feature in Turkish. [launch codename "kur", project codename "kur"] We launched improvements to the currency conversion search feature in Turkish. Try searching for [dolar kuru], [euro ne kadar], or [avro kaç para].
  • Improvements to news clustering for Serbian. [launch codename "serbian-5"] For news results, we generally try to cluster articles about the same story into groups. This change improves clustering in Serbian by better grouping articles written in Cyrillic and Latin. We also improved our use of “stemming” — a technique that relies on the “stem” or root of a word.
  • Better query interpretation. This launch helps us better interpret the likely intention of your search query as suggested by your last few searches.
  • News universal results serving improvements. [launch codename "inhale"] This change streamlines the serving of news results on Google by shifting to a more unified system architecture.
  • UI improvements for breaking news topics. [launch codename "Smoothie", project codename "Smoothie"] We’ve improved the user interface for news results when you’re searching for a breaking news topic. You’ll often see a large image thumbnail alongside two fresh news results.
  • More comprehensive predictions for local queries. [project codename "Autocomplete"] This change improves the comprehensiveness of autocomplete predictions by expanding coverage for long-tail U.S. local search queries such as addresses or small businesses.
  • Improvements to triggering of public data search feature. [launch codename "Plunge_Local", project codename "DIVE"] This launch improves triggering for the public data search feature, broadening the range of queries that will return helpful population and unemployment data.
  • Adding Japanese and Korean to error page classifier. [launch codename "maniac4jars", project codename "Soft404"] We have signals designed to detect crypto 404 pages (also known as “soft 404s”), pages that return valid text to a browser, but the text only contains error messages, such as “Page not found.” It’s rare that a user will be looking for such a page, so it’s important we be able to detect them. This change extends a particular classifier to Japanese and Korean.
  • More efficient generation of alternative titles. [launch codename "HalfMarathon"] We use a variety of signals to generate titles in search results. This change makes the process more efficient, saving tremendous CPU resources without degrading quality.
  • More concise and/or informative titles. [launch codename "kebmo"] We look at a number of factors when deciding what to show for the title of a search result. This change means you’ll find more informative titles and/or more concise titles with the same information.
  • Fewer bad spell corrections internationally. [launch codename "Potage", project codename "Spelling"] When you search for [mango tea], we don’t want to show spelling predictions like “Did you mean ‘mint tea’?” We have algorithms designed to prevent these “bad spell corrections” and this change internationalizes one of those algorithms.
  • More spelling corrections globally and in more languages. [launch codename "pita", project codename "Autocomplete"] Sometimes autocomplete will correct your spelling before you’ve finished typing. We’ve been offering advanced spelling corrections in English, and recently we extended the comprehensiveness of this feature to cover more than 60 languages.
  • More spell corrections for long queries. [launch codename "caterpillar_new", project codename "Spelling"] We rolled out a change making it more likely that your query will get a spell correction even if it’s longer than ten terms. You can watch uncut footage of when we decided to launch this from our past blog post.
  • More comprehensive triggering of “showing results for” goes international. [launch codename "ifprdym", project codename "Spelling"] In some cases when you’ve misspelled a search, say [pnumatic], the results you find will actually be results for the corrected query, “pneumatic.” In the past, we haven’t always provided the explicit user interface to say, “Showing results for pneumatic” and the option to “Search instead for pnumatic.” We recently started showing the explicit “Showing results for” interface more often in these cases in English, and now we’re expanding that to new languages.
  • “Did you mean” suppression goes international. [launch codename "idymsup", project codename "Spelling"] Sometimes the “Did you mean?” spelling feature predicts spelling corrections that are accurate, but wouldn’t actually be helpful if clicked. For example, the results for the predicted correction of your search may be nearly identical to the results for your original search. In these cases, inviting you to refine your search isn’t helpful. This change first checks a spell prediction to see if it’s useful before presenting it to the user. This algorithm was already rolled out in English, but now we’ve expanded to new languages.
  • Spelling model refresh and quality improvements. We’ve refreshed spelling models and launched quality improvements in 27 languages.
  • Fewer autocomplete predictions leading to low-quality results. [launch codename "Queens5", project codename "Autocomplete"] We’ve rolled out a change designed to show fewer autocomplete predictions leading to low-quality results.
  • Improvements to SafeSearch for videos and images. [project codename "SafeSearch"] We’ve made improvements to our SafeSearch signals in videos and images mode, making it less likely you’ll see adult content when you aren’t looking for it.
  • Improved SafeSearch models. [launch codename "Squeezie", project codename "SafeSearch"] This change improves our classifier used to categorize pages for SafeSearch in 40+ languages.
  • Improvements to SafeSearch signals in Russian. [project codename "SafeSearch"] This change makes it less likely that you’ll see adult content in Russian when you aren’t looking for it.
  • Increase base index size by 15%. [project codename "Indexing"] The base search index is our main index for serving search results and every query that comes into Google is matched against this index. This change increases the number of documents served by that index by 15%. *Note: We’re constantly tuning the size of our different indexes and changes may not always appear in these blog posts.
  • New index tier. [launch codename "cantina", project codename "Indexing"] We keep our index in “tiers” where different documents are indexed at different rates depending on how relevant they are likely to be to users. This month we introduced an additional indexing tier to support continued comprehensiveness in search results.
  • Backend improvements in serving. [launch codename "Hedges", project codename "Benson"] We’ve rolled out some improvements to our serving systems making them less computationally expensive and massively simplifying code.
  • “Sub-sitelinks” in expanded sitelinks. [launch codename "thanksgiving"] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
  • Better ranking of expanded sitelinks. [project codename "Megasitelinks"] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
  • Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
  • Less snippet duplication in expanded sitelinks. [project codename "Megasitelinks"] We’ve adopted a new technique to reduce duplication in the snippets of expanded sitelinks.
  • Movie showtimes search feature for mobile in China, Korea and Japan. We’ve expanded our movie showtimes feature for mobile to China, Korea and Japan.
  • No freshness boost for low quality sites. [launch codename “NoRot”, project codename “Freshness”] We’ve modified a classifier we use to promote fresh content to exclude sites identified as particularly low-quality.
  • MLB search feature. [launch codename "BallFour", project codename "Live Results"] As the MLB season began, we rolled out a new MLB search feature. Try searching for [sf giants score] or [mlb scores].
  • Spanish football (La Liga) search feature. This feature provides scores and information about teams playing in La Liga. Try searching for [barcelona fc] or [la liga].
  • Formula 1 racing search feature. [launch codename "CheckeredFlag"] This month we introduced a new search feature to help you find Formula 1 leaderboards and results. Try searching [formula 1] or [mark webber].
  • Tweaks to NHL search feature. We’ve improved the NHL search feature so it’s more likely to appear when relevant. Try searching for [nhl scores] or [capitals score].
  • Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.
  • More authoritative results. We’ve tweaked a signal we use to surface more authoritative content.
  • Better HTML5 resource caching for mobile. We’ve improved caching of different components of the search results page, dramatically reducing latency in a number of cases.