Tuesday, September 17, 2013

SEO Web Design Methodology

Web use has increased by leaps and bounds since the early days of the Internet. Search engines are busily indexing the most obscure corners of the Web, and online commerce is booming. Whether the business is driving leads to a physical store or the conversion happens on the website itself, consumers have come to expect that anything they could want or need can be found online. Given those expectations, how do you go about establishing an Internet marketing presence and getting highly ranked in the search engines? The very best way is to build a website that follows search engine optimization (SEO) best practices while keeping usability, audience and niche in mind.


It used to be that companies didn’t have to worry too much about website design. Just throwing the information on the page was ‘good enough’. That’s not the case today. In today’s world, the search engines and a few million Internet users scour the Web daily looking for fresh and interesting content. The design of your site will influence how crawlable it is for the search engines and how easy it is for users to navigate. If the engines get bogged down on your homepage, or if users can’t find what they want and get turned off, your site will lose its chance at being competitive in search engine marketing.


SEO Web Design Methodology


Step 0 – Architecture/Design Planning


Before you begin to build your website, you need to decide what you want your site to accomplish. You must establish what your Internet marketing strategy will be. There are two basic kinds of users on the Web: those interested in researching a topic, and shoppers who have finished their research and are ready to act. Your site should be built to satisfy whichever of these types of searchers you are looking to attract. Determine before you begin whether you are going to be selling something or providing information. Knowing your niche is critical to the success of your search engine optimization design project.


Know who your audience will be and who you want to target. Researchers have different needs than shoppers and it can be tricky to target both. Your target audience will determine your site focus.


Creating a web persona is a good place for you to start in figuring out who your audience is. We have a great article about creating a web persona and recommend you read it. Here are the highlights:


  • Understand (and keep in mind) your target audiences’ goals and beliefs

  • Develop the most effective voice for your company

  • Determine what products/features will and will not be accepted by your audience

  • Get to know your audience on a more personal level

  • Build a shared vocabulary between you and your audience to avoid confusion

  • Enable your company to make informed decisions

An interesting and useful tool that can help you make sure your business is customer-centric, not “you-centric,” is the We We Calculator. It checks your content and gives you a score comparing how much you talk about yourself with how much you talk about your customers and meeting their needs.
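The idea behind a “we we” check is simple enough to sketch in a few lines. This is a rough, hypothetical illustration; the word lists are guesses, not the actual tool’s:

```python
import re

# Hypothetical word lists; the real We We Calculator's lists are unknown.
SELF_WORDS = {"we", "our", "ours", "us", "i", "my"}
CUSTOMER_WORDS = {"you", "your", "yours"}

def we_we_score(text: str) -> float:
    """Share of focus words that address the customer, from 0.0 to 1.0."""
    words = re.findall(r"[a-z']+", text.lower())
    self_count = sum(1 for w in words if w in SELF_WORDS)
    customer_count = sum(1 for w in words if w in CUSTOMER_WORDS)
    total = self_count + customer_count
    return customer_count / total if total else 0.0

print(we_we_score("We built our product so you can reach your goals faster."))  # → 0.5
```

A score near 1.0 suggests customer-focused copy; near 0.0 suggests you are mostly talking about yourself.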


Step 1 – Site Analysis/Site Assessment


Once you have determined what your site is going to be about and who it’s going to target, you need to analyze your competition. How are they accomplishing their goals, can you improve on what they are doing, and can you offer something more? What is going to set you apart from them, and more importantly, put you above them?


The first step to take when identifying your competition is to determine what keywords are being targeted for your industry in the search engine marketing space. Bruce Clay has developed a whole methodology and set of SEO tools for analyzing your competition and choosing keywords. We recommend that you read our search engine optimization overview to really get a handle on targeting your competition. An abbreviated list of steps for keyword analysis that you will need to take is as follows:


  1. Identify Your Competition

  2. Use Link Tracking

  3. Identify Keywords

 


Step 2 – SEO Design


Once you have planned and analyzed your site, there are two facets of SEO-friendly design to consider. First, think about your users. How are they going to navigate through and use your site? You want to make it easy to understand and easy to get around. What colors and typography will help them navigate and make them want to delve deeper into your site? What type of navigation will you use: drop-down menus, left-hand or right-hand sidebars? Navigation is vital: visitors who can get around easily stay on your site longer, so make sure they can navigate with ease.


The second thing you need to think about is the search engines. While it is the job of your search engine optimization specialist to make the most of your web pages there are several things that you should consider during the SEO design phase:


  • What type of architecture are you going to use for your site?

  • How will the navigation you put in place for your users affect navigation for the search engines?

  • What do the search engines want to see and what will they reward vs. penalize?

  • Do you already have content that you can use or will you have to have it written?

  • What about Cascading Style Sheets (CSS), JavaScript, and being W3C compliant?

Besides using straight HTML for coding your web site, there are other options that you can use for building your web site. Three of the most popular are Flash, Ajax, and PHP. Like all technologies there may be benefits and drawbacks to implementing them but they are definitely worth looking into.


Flash has become a popular tool in internet marketing and is now widely used across the web, including our own site. However, there can be drawbacks to it, so you need to carefully determine whether or not your site will benefit from its use. One of the benefits of using Flash is the visual appeal of it. You can use it with your navigation (as we do), add interesting visual graphics, and even make a game out of it.


One disadvantage you may discover about using Flash though is that not everyone has the Flash player and not everyone may want to download it. This is where evaluating who your audience is becomes very important. If you are catering to a crowd who is technologically savvy, then you are probably ok. However, if your audience is older and didn’t grow up with the internet and personal computers, then you might want to use a more conservative approach in the technologies that you use on your site. Also, keep in mind that pages using Flash may not index well and you may end up losing rankings if you are using Flash on the pages that you want to be indexed.


Ajax is shorthand for Asynchronous JavaScript and XML. It is a web development technique for creating interactive web applications. The term Ajax is relatively new, but the technologies behind it were developed in the 1990s. Ajax is rapidly gaining popularity in the programming community because it exchanges data with the server behind the scenes, so pages can update without a full reload and feel faster. That responsiveness is one of the most important components of keeping people on, and coming back to, your site. To learn more about Ajax, read the essay by Jesse James Garrett, who coined the acronym.


PHP is a general-purpose scripting language embedded within your HTML code. Among other uses, it allows web developers to create dynamic web content that interacts with databases. To use PHP you need to make sure that your server supports it and that it is enabled; contact your systems administrator if you are not sure. Keep in mind that PHP runs on the server, not in the visitor’s browser: it generates pages before they are sent, which is why it is described as a server-side scripting language. PHP.net is a great resource for finding out more about PHP and how it can enhance your website. One advantage of PHP is that your visitors will not need to download any special application to view pages that use it. One disadvantage is that it requires some basic programming skill to implement. It is not difficult, though, and you do not need to be a programmer to understand it, learn it and ultimately write it; you just need time to sit down and learn it.
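Since this article sticks to one illustration language, here is the server-side idea behind PHP sketched in Python instead: pull data out of a database and merge it into HTML before the page is sent to the browser. The table and values are invented for the example:

```python
from string import Template
import sqlite3

# Python standing in for PHP: fetch data and embed it in HTML server-side.
# The products table and its contents are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.execute("INSERT INTO products VALUES ('Widget', 9.99)")

page = Template("<h1>$name</h1><p>Only $$$price</p>")  # $$ is a literal $
name, price = conn.execute("SELECT name, price FROM products").fetchone()
html = page.substitute(name=name, price=price)
print(html)  # → <h1>Widget</h1><p>Only $9.99</p>
```

In PHP the template and the logic would live in the same .php file, but the principle is identical: the visitor only ever receives finished HTML.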


Step 3 – Web Design Implementation


After you have architected the site, you need to decide who is going to build the templates and write the content. Do you have an in-house design/web development team or will you hire an outside firm? What about search engine marketing – can you do your own or will you have to hire this out as well? Remember to follow our Quality Site Criteria guidelines. These are absolutely vital in ensuring that your site has the proper ‘curb appeal.’


Adding keywords to your content is an important search engine optimization component for each page that you submit to the search engines for spidering. So where and how should you add them, and how will that affect your design?


It has been shown that pages whose keywords (and keyword phrases) appear prominently in the title tag and meta description consistently rank higher than those without. You must also be sure to use those keywords and phrases throughout the page content. We think that linking pages together using the keywords of the landing page in the anchor text of the sending page is a must for good SEO. “…use text links within paragraphs when possible, especially when the pages are related. If the topics are not related, then use image links so the search engines do not see the text and get confused.”


You must be careful about how many images you use on a page. While images may help the page look pretty, you must remember that some people turn off images so that pages load faster, while others may be using screen readers, voice recognition, or speech synthesizers because they are visually impaired (which can include color blindness and dyslexia, not just low or no vision).
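A quick automated pass can catch images that are missing alternative text before a screen-reader user ever hits the page. A minimal sketch using Python’s standard HTML parser (it treats an empty alt as missing, although purely decorative images may legitimately use alt=""):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect the src of every <img> tag that lacks a non-empty alt."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Treats alt="" as missing; decorative images may use it on purpose.
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "<no src>"))

checker = AltChecker()
checker.feed('<p><img src="logo.png" alt="Company logo"><img src="deco.png"></p>')
print(checker.missing_alt)  # → ['deco.png']
```

Running something like this over your templates before launch costs nothing and helps both visually impaired visitors and the search engines, which read ALT text too.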


Step 4 – Website Testing


Testing your site is important. You need to run some informal user tests. Get some people (coworkers not involved in building the site, family, friends, etc.) to go through every inch of your site and tell you where they get bogged down, confused or lost. Use their comments to fix the problems, then have them go through the site again. After that, bring in new people to find things that may have been missed the first two times.


Step 5 – Site Maintenance


Once your site has been launched and submitted to the search engines you need to decide, if you haven’t already, who will be responsible for updating and maintaining your site. If you used an in-house team, then most likely they will be able to continue with the maintenance of the site. You also need to figure out who’s going to keep an eye on your competitors and continue to SEO your site. If you used an outside firm for this, can you afford to continue with them or should you look into taking SEO classes yourself or offering an SEO training course to your Web design team?


You are going to need to continue monitoring the search engine rankings, and based on the movement of your site you may need to tune your keyword list. This is where the job of the designer most likely ends and the job of the search engine marketing expert or SEO begins. However, we will discuss it a bit here.


Tuning your keyword list involves using the search engine that has the fastest indexing service so that you can quickly determine which keywords and phrases are working the best and which are not. Again, make sure that your keywords are in all the META tags (title, description and keywords). It is important to include them in your image ALT attributes as well, and try to use them in the opening lines of the page so that the theme of the page is known right from the start.
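Those placement rules (title, description, ALT attributes, opening lines) are easy to spot-check programmatically. A simplified sketch, assuming the page elements have already been extracted into plain text (the example site and copy are invented):

```python
def keyword_report(page: dict, keyword: str) -> dict:
    """Check a keyword against the main on-page elements.

    `page` holds pre-extracted text; a real check would parse live HTML."""
    kw = keyword.lower()
    return {
        "title": kw in page.get("title", "").lower(),
        "description": kw in page.get("description", "").lower(),
        "alt_text": any(kw in alt.lower() for alt in page.get("alts", [])),
        # Opening lines: the theme should be stated early on the page.
        "opening": kw in page.get("body", "").lower()[:200],
    }

page = {
    "title": "Healthy Meal Delivery | FreshCo",  # invented example site
    "description": "FreshCo delivers healthy meals to your door.",
    "alts": ["Photo of a healthy meal"],
    "body": "Our healthy meals are designed by nutritionists.",
}
print(keyword_report(page, "healthy meal"))
```

Any False in the report is a page element worth revisiting during keyword tuning.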




Wednesday, August 7, 2013

The Inconvenient Truth About SEO

Do you own a website? Do you want to be number one on Google? Whatever you do, don’t spend money on aggressive search engine optimization (SEO). I know that sounds like an extreme position to take. However, a lot of website owners see search engine optimization as the answer to their search ranking woes, when things are considerably more complex.


The inconvenient truth is that the best person to improve your ranking is you. Unfortunately, that is going to take time and commitment on your part. The answer doesn’t lie in hiring an SEO company to boost your website’s Google ranking. The problem starts with the term “search engine optimization” and the misconceptions surrounding it.


What SEO Isn’t


Most website owners perceive SEO as a dark art, shrouded in mystery. They have heard phrases like “gateway pages” and “keyword density” or have been bamboozled by technobabble about the way websites should be built. All of this has left them feeling that SEO is the purview of experts. This is a misconception reinforced by certain segments of the SEO community.


The problem is that these kinds of complex techniques do work, to a point. It is possible to improve placement through a manipulation of the system. However, although it can have short term benefits, it will not last without continual investment. This is because the objective is wrong. SEO shouldn’t be about getting to the top of Google for particular phrases. In fact, we shouldn’t be optimizing for search engines at all. We should be optimizing for people. After all, that is what Google is trying to do.


Why You Shouldn’t Be Optimizing For Search Engines


Google’s aim is simple: connect its searchers with the most relevant content. If you are more worried about a good ranking than providing relevant content, then you are going to be fighting a losing battle.


If you hire an SEO company to improve your placement and you measure their worth by how high they get you in the rankings, then you are out of line with what Google is trying to achieve. Your primary objective should be better content, not higher rankings.


Original, valuable content.
Image credit: Search Engine People Blog.


The SEO company can use every trick in the book to get you better rankings, but over the long term they will lose, because Google is constantly changing how it rates websites so it can provide more accurate results.


Remember, you shouldn’t be optimizing for ranking in search engines, you should be optimizing for users.


A Better Way


Google does not make a secret of how to gain a high ranking. It states clearly in its webmaster guidelines:


“Make pages primarily for users, not for search engines.”


So how do you actually do that? Again Google provides the answer:


“Create a useful, information-rich website, and write pages that clearly and accurately describe your content.”


In short, write useful content. This could include (but is not limited to):


  • Publishing white papers,

  • Writing a blog,

  • Sharing research findings,

  • Producing detailed case studies,

  • Encouraging user-generated content,

  • Creating useful applications or tools,

  • Running a Q&A section,

  • Posting interviews

The list could go on. The key is to produce content people find useful and want to share.


Yes, there are some technical considerations when it comes to search engines. However, any reasonably well-built website will be accessible to Google. You don’t need an expert SEO company for that (at least not if the Web designer does their job right).


As an aside, it is worth noting that if you take accessibility seriously for users with disabilities (such as those with visual impairments), then you will also make a website accessible to Google.


However, setting those technical issues aside, it all comes down to content. If you create great content, people will link to it, and Google will improve your placement. It really is that simple.


The question then becomes, how do you create great content?


The Inconvenient Truth


This is the point where we come to the inconvenient truth. It is hard for an outside contractor to produce the great content that will keep users coming back and encourage them to share. In my experience, this is much better done internally within the organization. The problem is that this doesn’t sit well with most organizations. It’s easier to outsource the problem to an SEO company than to tackle an unfamiliar area internally.


Admittedly, a good SEO company will have copywriters on board who can write content for you. However, their knowledge will be limited, as will their ability to really get to know your business. Yes, they can write a few keyword-heavy blog posts that Google will like the look of. However, this won’t fool users, and so the number of links to that content will be low.


The truth is that if you are serious about improving your placement on search engines, it has to be done internally.


This truth is all the more painful, as most organizations are not configured to do this properly.


Organizational Change Required


The more I work with organizations on their digital strategy, the more I realize how few are structured to do business in a digital world. The issue of SEO is an ideal example of the problem.


Responsibility for the website normally lies with the marketing department. Although marketing is experienced at producing copy that outlines the products and services the organization provides, they are not best equipped to write content that will be heavily linked to.


It is not surprising that if you search on a term like “call to action,” the top results are almost exclusively informational articles, rather than companies helping with services in this area.


The problem is that marketers are experts in the product or service being sold, not necessarily in the surrounding subject matter. For example, the marketing department of a company selling healthy meals will know everything about the benefits of their product but will have limited knowledge of nutrition. Unfortunately, people are more likely to link to a post on healthy eating tips than to marketing copy about a particular health product.


What you really need is the nutritional expert who designed the meal to be posting regularly to a blog, talking about what makes a healthy diet. A blog like this would include lots of linkable content, would be able to build a regular readership and would produce keyword-rich copy.


The problem is that this is not how organizations are set up. It is not the nutritional expert’s job to write blog posts; that responsibility belongs in marketing.


The Long-Term Solution


Ultimately organizations need to change so that online marketing is a more distributed role with everybody taking responsibility for aspects of it. I am not suggesting that the central marketing function has no role in digital, but rather recognizing that they cannot do it alone. Others will need to have some marketing responsibilities as part of their role.


For example a company selling healthy meals should allocate one afternoon each week for their nutritional experts and chefs to share their expertise online. It would become the marketing department’s responsibility to support these bloggers by providing training, editorial support and technical advice.


Unfortunately, these experts are often the most valuable resource within a business, and so their time is incredibly valuable. The idea of “distracting” them from their core role is too much for many companies to swallow.


However, in the short term there is still much that can be done.


A Short-Term Solution


As we wait for companies to wake up and change the way they are organized, there are ways of working within the system.


If you haven’t already, consider hiring an employee dedicated to creating content for your website. You can partially finance it with the money you save by getting rid of your SEO company.


If that is beyond your budget, consider hiring a short-term contractor or a part-time staff member. You could even use an existing member of your staff as long as they have time set aside to prevent the Web being pushed down the priority list. Although this person won’t have the knowledge to write all the content themselves, by being situated inside of the business it will be much easier for them to get access to those within the organization who do.


Arrange meetings with these experts and talk to them about their role. Identify various subjects based on their knowledge and then either record a video interview or write up a blog post based on what they share. Also ask these experts what news sources they read or which people within the industry they follow. Monitor these sources and ask your expert to comment on what is shared. These comments can be turned into posts that add to the wealth of content on your website.


Finally, you may find that the experts within the business are already producing a wealth of content that can act as source material for content that users will find interesting.


For example, our fictional nutritional expert probably already has documentation on the health benefits of certain food types or how certain conditions can be helped through healthy eating. Admittedly this kind of material might be too dry or academic, but with some editing and rewriting it would probably make great online content.


The content you post does not have to be long; it just has to be link-worthy. The key is to share the opinion of your expert and provide content of value to your audience.


As that audience grows, start asking questions. Maybe even get some of your readers to share their experiences or knowledge. Over time you will discover that not only will your readers want to contribute, so will your experts. As they see the value in posting content regularly to the website, they will start blogging themselves. All you will have to do is suggest topics and edit their output.


I know what you are thinking: it just isn’t that simple.


No More Excuses


I realize this is a big cultural shift for many organizations. Marketing teams will feel they are losing control, the person responsible for blogging will feel out of their depth and the experts may resent being asked lots of questions. However, what is the alternative?


For better or worse, Google demands good content in return for high rankings. Pretending that SEO companies can magically find a shortcut that allows you to avoid this tradeoff just isn’t going to cut it.


If you care about how you rank, it is time to take responsibility for your website’s content. Once you overcome the initial hurdle, you will find that producing quality content on an ongoing basis becomes second nature.


Update (17.12.2012)


After a heated discussion in comments to this article, in social channels and via Skype, Paul clarified his position in the article How I See The Role of SEO in his blog. We are republishing the article for the sake of making his arguments clear and unambiguous — of course, with Paul’s permission.—Ed.


There seems to be the perception that I want to see an end to the SEO sector. Although I have issues with the name, I do believe they have a role.


Last week I once again expressed my concerns about website owners’ obsession with SEO in a post for Smashing Magazine.


My message can be boiled down to the following points:


  • Website owners are unhealthily obsessed with their rankings on Google.

  • We should be creating primarily for people and not search engines.

  • The best way to improve your ranking is to produce great content that people link to.

  • That great content is better produced in-house, rather than being outsourced to an agency.

  • A good web designer can take you a long way in making your site accessible to search engines.

  • Before you spend money on an SEO company, make sure you have the basics in place first.

An Unfortunate Response


Unfortunately this caused a massive and aggressive reaction in the SEO community. Smashing Magazine was attacked for publishing the post, and I was told I was out of date and ill-informed (which is of course entirely possible). Worst of all, there were a shocking number of attacks on me personally.


To be honest this doesn’t entirely surprise me. I have been working with the Web long enough to be all too aware of the overreaction it creates in people. However, it is always hurtful when somebody attacks you as a human being, rather than your opinion.


Of course not everybody was like that. I had great conversations with Bill Slawski and Joost De Valk, both of whom attempted to put me straight, personally and on their blogs. I very much appreciate them taking the time, and they have helped to soften my views.


SEO Companies Do Have A Role


I think it is important to stress that I do believe SEO companies have a role. The problem is they are often brought in when there is still much work that could be done internally within the organisation.


To me it’s about return on investment. Why spend money only on improving your search engine rankings when you could spend the same money improving rankings and producing more engaging content? Or on improving your rankings and building a more accessible website?


There are two exceptions to that general rule of thumb.


CONTENT STRATEGY


First, the SEO industry is changing. They are increasingly helping clients with content and that is great. However, if that is the role they are going to take then they need to stop saying they are about “search engine optimisation.” Creating great content is not primarily an SEO job. They have a branding issue there.


Also, although I am happy for an SEO company to help educate clients about content, they shouldn’t be writing copy for them week in and week out. Take the approach of a content strategist: train up the client, provide them with a strategy, and then encourage them to take on the role themselves. Isn’t that better for the client?


CLEANING UP AFTER BAD WEB DESIGNERS


The second exception is where the web designer has built an inaccessible website. As Joost De Valk said in his response to my post, it falls to the SEO company to clean up the mess.


This is obviously an issue that needs addressing in the web development community and why we need people like Joost speaking at web design conferences.


However, I wouldn’t expect a web developer to provide all of the technical subtleties of an SEO company. That is probably too specialist for most web designers to do.


I don’t doubt that these subtleties are important and do make a difference to rankings. However, once again it is important that we have the basics in place first:


  • Great content.

  • A solidly built website.

Setting The Right Priorities


Hopefully that helps clarify my position slightly. I am not for a minute trying to destroy the SEO sector (as I was accused of repeatedly). What I am trying to do is set priorities straight.


I guess, in short, it is the phrase “search engine optimisation” I have a problem with. It implies we should be accommodating the idiosyncrasies of search engines above the needs of users.


That is something I will never compromise over and I am sure something the vast majority of SEO companies would agree with.




Source: http://www.smashingmagazine.com/




Friday, August 2, 2013

Thursday, May 30, 2013

SEO Bee Tree | "Hilltop Algorithm"

Source: seoisrael.com
Category: SEO Bee Tree

At the end of 2003, the Google search engine started using a new algorithm for ranking its search results: the Hilltop algorithm. Opinions vary about exactly when this new algorithm was put to use (whether during the Florida Google Dance or before), but there is general agreement that it is already in use.

The Meaning of the Hilltop Algorithm

The Hilltop algorithm was created by Krishna Bharat in California between 1999 and 2000. In January 2001, he patented a related algorithm in partnership with Google. The patented algorithm differs from the one presented in the original article, and it is not clear which of the two options is actually used. Today, Krishna works for Google. Although Google has never officially announced that it uses either of these algorithms, it seems it couldn't overlook their advantages.

In the past, Google was interested in the number and quality of incoming links to a page, not in where those links came from. Today the game has changed significantly.

There is a new webpage status called Expert Documents. When an expert document links to a webpage, this serves as a "vote of confidence" for the target page, and as a result the page's rank increases in the search results. There are different expert documents for different search terms.

A page that didn't receive a link (a vote of confidence) from at least two expert documents will not receive a score from the algorithm at all! It may still appear in search results because of other factors (e.g. PageRank and on-page factors), but the page's position will clearly suffer.

The key difference between the original article and the registered patent is in the way an expert document is selected.

Pre-defined expert document list

If you read the article written by Krishna Baharat (you can read the original Hilltop article here), it looks like the search that is based on the new algorithm consists of three steps:

Locating expert documents: constructing a pre-defined list of expert documents from available webpages. This list is general and relatively permanent. Next, affiliated sites are removed from the list (see below).

Correlating expert documents with the search query: when a search is executed, the query is matched against the expert document list, creating a topic-specific sub-list of the full expert documents list.

Assigning a LocalScore to the search: a LocalScore is given to any page that comes up in the search, according to the anchor text of incoming links from pages included in the expert document list created in the previous step. If the page has fewer than two links from expert documents there, it receives no score whatsoever from the algorithm.
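The scoring rule described above (no score at all without at least two expert links) can be sketched as a toy function. The link weights and URLs here are invented for illustration; the real algorithm also weighs anchor text, which this sketch ignores:

```python
def local_score(page_url: str, expert_links: dict) -> int:
    """Toy LocalScore: sum link weights from expert documents, but only
    when at least two distinct experts link to the page."""
    links = expert_links.get(page_url, [])
    if len(links) < 2:  # the "at least two expert documents" threshold
        return 0
    return sum(weight for _expert, weight in links)

# Maps a target page to (expert document, link weight) pairs; all invented.
expert_links = {
    "example.com/a": [("experts.org/seo", 3), ("uni.edu/links", 2)],
    "example.com/b": [("experts.org/seo", 5)],  # one expert only: no score
}
print(local_score("example.com/a", expert_links))  # → 5
print(local_score("example.com/b", expert_links))  # → 0
```

Note how a page with one strong expert link scores zero while a page with two weaker ones scores: the threshold, not raw link weight, is the gatekeeper.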

Expert documents are defined as pages on a certain topic that were created especially to direct visitors to information about that topic, meaning that they contain links to other sites that are not affiliated with them. Examples of this kind of page are index pages, institutional websites (.org, .gov) and university websites (.edu). This means it is no longer enough to receive links from commercial sites as before; you should receive links from authoritative pages, which are naturally harder to come by.

The importance of correct registration in indexes and other expert websites has increased greatly, while the importance of links from "simple" sites and pages has decreased. An additional result is the greater importance of the anchor text of the links on expert documents that point to your site.

Search and ReRank

If you look at the patent that Krishna Baharat registered in cooperation with Google (that can be called the Hilltop Patent), you can see that the search consists of three steps according to the new algorithm:

Initial search: Performing an initial search on all search words (similar to the original pre-algorithm search).

Filtering affiliated pages: Removing affiliated pages (see below).

Assigning a LocalScore: Assigning a LocalScore to pages on the list according to incoming links from pages that are also found on the list. The basic assumption is that the listed pages are the most relevant to this search, therefore only links from these pages will be counted.

Affiliated Pages

A clean-up of the expert pages includes removing pages that share the same domain (domain.com, domain.co.il, two.domain.com, www.domain.com) or pages located in the same IP group (the first three octets of the IP address are identical, e.g. 212.125.23.XXX).
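The affiliation test just described can be sketched as two simple checks. This is a simplified reading of the rule: real suffix handling needs a full public-suffix list, and the short suffix set below is my assumption for the examples given above.

```python
# Sketch of the affiliation rule: two hosts count as affiliated if they
# share a domain token or sit in the same /24 IP range (first three
# octets equal). Domain handling here is deliberately simplified.

def same_domain(host_a, host_b):
    def key(host):
        parts = host.lower().split(".")
        # Drop a two-part country suffix like "co.il" so that
        # "domain.co.il" and "domain.com" both yield the token "domain".
        if len(parts) >= 3 and parts[-2] in {"co", "com", "org", "ac"}:
            parts = parts[:-1]
        return parts[-2] if len(parts) >= 2 else parts[0]
    return key(host_a) == key(host_b)

def same_ip_group(ip_a, ip_b):
    # Compare the first three octets (the "IP group" described above).
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

def affiliated(a, b):
    return same_domain(a["host"], b["host"]) or same_ip_group(a["ip"], b["ip"])

print(same_domain("www.domain.com", "two.domain.com"))   # True
print(same_ip_group("212.125.23.7", "212.125.23.200"))   # True
print(same_ip_group("212.125.23.7", "212.125.24.7"))     # False
```

Only one link from each affiliated cluster would then be allowed to contribute to a page's LocalScore.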

What about marginal search terms?

That depends on which Hilltop version you believe in. Under the second version, all that matters is whether the initial list (after page filtering) is big enough; if there are not enough websites, the algorithm is not activated.

Under the first version, if the query is not among the popular search terms for which expert documents were prepared (or found in real time), Google falls back to the old search method (that is, without Hilltop).

How do you deal with Hilltop?

First, keep doing what you were doing so far - optimizing your site and building links for PR improvement. The weight of these factors may have decreased, but they still matter a great deal.

The key to dealing with Hilltop is identifying the websites and webpages that are defined as experts in your website's field. How do you find them? That depends on which version you believe in. If we are dealing with a pre-defined list, then DMOZ will be one of the experts, and so will Yahoo!. Other indexes may appear as experts, but are not necessarily so.

According to the second version, you need to perform a search with certain keywords, and try to get links from the webpages that appear in the search results. In addition, you should check the incoming links to pages that appear at the top of the search results, and try to get the same links.

Tuesday, May 28, 2013

SEO Bee Tree | "How to Recover from Penguin 2.0"

Source :  searchengineguide.com
Category : SEO Bee Tree

For the approximately 2.3% of regular US-English queries that it affects, many SEOs have been waiting to see if the traffic to their website will be negatively affected.
Recover from Penguin

While many black-hat practitioners that were affected by Penguin 1.0 can only hope their rankings don't plunge further into the depths of obscurity, the rest of us are crossing our fingers and hoping that we made it through relatively unscathed.

However, if your website was affected by Penguin 2.0, there are a few things you can do to ensure that your website gets back on the straight and narrow. And remember: the majority of positive comebacks after Google algorithm updates come from websites genuinely wanting to provide the best possible experience for users, not from those just looking to skirt by until the next Google Zoo animal is released.

If you were affected by Penguin 2.0, read on. I've done my best to guess what Google has changed, based on information from Matt Cutts prior to the launch of Penguin 2.0, as well as what Google aimed to solve with Penguin 1.0. As the SEO community performs tests and learns more information about specifics of Penguin 2.0, the collective knowledge of the update may change. For now, if you've been hit and are looking for likely answers, here's what you need to do to recover your rankings.

1. Learn From Your Mistakes

The previous Google algorithm updates have already told us what Google is looking for: great content, natural link profiles (both inbound and outbound), and an organic, natural link velocity that steadily increases month by month. If you were penalized by previous Panda or Penguin updates, then it's time to make the changes necessary to get back into Google's good graces.

If Panda and Penguin 1.0 were slaps on the hand, then consider 2.0 the wake-up call you need to get your website completely back on track. The next updates will be even more intuitive and stringent, so making those big content-focused changes now can prevent even further lost traffic in the future.

2. Stop Shady Practices

A crucial step toward recovering from Penguin 2.0 is to stop all "shady" practices as determined by Google. This includes unnatural link building, and spammy or keyword-stuffed onsite content.

Beyond looking for these types of black hat practices, Google is getting better at recognizing when sites are trying to be deceitful: targeting typically spammy search queries, participating in link-swapping schemes, and publishing content that contains keyword stuffing or unnatural links. The easiest step toward recovering from Penguin 2.0 is to stop these practices altogether.

3. Pick up The Pieces

The bridge between destruction and normalcy is the recovery phase, which may take weeks or months, depending on the website and the level of black hat SEO practices Google believed it was involved in.

If you believe that your website was penalized because of shady link building practices (which is likely if you were hit by Penguin 2.0), you'll need to identify what links could be causing you harm, and then attempt to have them removed. Any links you fail to get removed, be sure to disavow.

To identify what links could be causing you harm, use a tool like Open Site Explorer or Majestic SEO. Alternatively, have a professional perform a link profile audit to identify harmful links for you. Here's a step-by-step walkthrough I wrote that describes how to audit your own link profile.

You can disavow links through Google Webmaster Tools. This tool should be used with caution and only after personally reaching out to these websites to get the links removed. I recommend disavowing all harmful links (even ones that have been successfully removed); it can't hurt.
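The audit-then-disavow workflow above can be sketched as a small script that turns a vetted list of harmful links into a disavow file. The comment lines and `domain:` directives match the file format Google Webmaster Tools accepts; the link list and helper name here are hypothetical examples.

```python
# Sketch: build a disavow file from a list of harmful links identified
# in a link-profile audit. Format: one URL or "domain:" directive per
# line, with "#" comment lines, as accepted by Google Webmaster Tools.

def build_disavow_file(harmful_links, disavow_whole_domains=True):
    lines = ["# Disavow file generated after a link-profile audit",
             "# Outreach was attempted before disavowing these sources"]
    seen = set()
    for url in harmful_links:
        # Extract the host from the URL (naive parse for the sketch).
        host = url.split("//", 1)[-1].split("/", 1)[0]
        entry = f"domain:{host}" if disavow_whole_domains else url
        if entry not in seen:  # one directive per domain/URL
            seen.add(entry)
            lines.append(entry)
    return "\n".join(lines)

# Hypothetical audit output: several links from two spammy hosts.
harmful = ["http://spam-directory.example/page1",
           "http://spam-directory.example/page2",
           "http://link-farm.example/widgets"]
print(build_disavow_file(harmful))
```

Disavowing at the domain level (rather than per URL) is usually the safer choice for sources you consider wholly spammy, since it also covers links you haven't discovered yet.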

Besides attempting to clean up your external link profile, you should also do a comprehensive audit of all pages on your website and make a list of what needs to be done internally. Work on publishing only high-quality content that provides value to the reader and includes only natural anchor text and useful internal linking. Make sure you are correctly set up for Google Authorship, and never use any keyword stuffing or unnatural language that may make Google think you are attempting to make your content appear to be something it's not.

4. Put Best Practices in Place

Once you begin to clean up the damage, it's time to set a plan for moving forward. Depending on how extensive your website is, this can include a recovery plan with targeted goals for rewriting all website content or for steadily removing any lingering inbound links acquired via black-hat methods.

After these goals are set, the emphasis should be on strictly implementing best practices for all future websites and content creation. All content and website links should:

  •     Use natural language
  •     Use natural anchor text, and only when it provides value
  •     Focus on providing the highest quality content
  •     Create value for humans instead of the search engines
  •     Contain NO black hat SEO practices

The SEO team should work closely with the content developers to flip the focus and message back onto providing quality over quantity, as well as a natural variety across content platforms. For instance, twenty poorly written articles about buying dresses online for an e-commerce clothing website aren't going to fare as well as four well-written, seasonal guide blog posts, an e-book on style or fashion, and a robust social media profile all about fashion and style.

For many SEOs who worked in the era when anything went, these algorithm updates are an unsettling change that has been difficult to adapt to. However, when it comes to SEO, websites should take on an "adapt or die" mentality, as search engines are becoming more intuitive about both what a user is searching for online and the quality of the websites that want to provide it.

Monday, May 27, 2013

SEO Bee Tree | "Google Penguin 2.0 Update is Live"

Source : http://searchenginewatch.com
Category : SEO Bee Tree

Webmasters have been watching for Penguin 2.0 to hit the Google search results since Google's Distinguished Engineer Matt Cutts first announced that there would be the next generation of Penguin in March. Cutts officially announced that Penguin 2.0 is rolling out late Wednesday afternoon on "This Week in Google".

"It's gonna have a pretty big impact on web spam," Cutts said on the show. "It's a brand new generation of algorithms. The previous iteration of Penguin would essentinally only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas."

In a new blog post, Cutts added more details on Penguin 2.0, saying that the rollout is now complete and affects 2.3 percent of English-U.S. queries, and that it affects non-English queries as well. Cutts wrote:

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

Webmasters first got a hint that the next generation of Penguin was imminent when back on May 10 Cutts said on Twitter, “we do expect to roll out Penguin 2.0 (next generation of Penguin) sometime in the next few weeks though.”

Matt Cutts Tweets About Google Penguin

Then in a Google Webmaster Help video, Cutts went into more detail on what Penguin 2.0 would bring, along with what new changes webmasters can expect over the coming months with regards to Google search results.

He detailed that the new Penguin was specifically going to target black hat spam, but would have a significantly larger impact on spam than the original Penguin and subsequent Penguin updates have had.

Google's initial Penguin update originally rolled out in April 2012, and was followed by two data refreshes of the algorithm last year – in May and October.

Twitter is full of people commenting on the new Penguin 2.0, and there should be more information in the coming hours and days as webmasters compare SERPs that have been affected and what kinds of spam specifically got targeted by this new update.

Friday, May 24, 2013

SEO Bee Tree | "How Search Engines Operate: A Beginner's Guide"

Source : seomoz.org
Category : SEO Bee Tree

Search engines have two major functions - crawling & building an index, and providing answers by calculating relevancy & serving results.

Imagine the World Wide Web as a network of stops in a big city subway system.

Each stop is its own unique document (usually a web page, but sometimes a PDF, JPG or other file). The search engines need a way to “crawl” the entire city and find all the stops along the way, so they use the best path available – links.


 “The link structure of the web serves to bind all of the pages together.”

Through links, search engines’ automated robots, called “crawlers,” or “spiders” can reach the many billions of interconnected documents.
Once the engines find these pages, they next decipher the code from them and store selected pieces in massive hard drives, to be recalled later when needed for a search query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engines have constructed datacenters all over the world.

These monstrous storage facilities hold thousands of machines processing large quantities of information. After all, when a person performs a search at any of the major engines, they demand results instantaneously – even a 1 or 2 second delay can cause dissatisfaction, so the engines work hard to provide answers as fast as possible.
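The crawl-and-index process described above can be sketched as a breadth-first walk over links. This toy version is an assumption-laden simplification: it works over an in-memory "web" mapping each URL to its text and outgoing links, whereas real crawlers add politeness delays, robots.txt handling, and deduplication at massive scale.

```python
# Toy crawler: follow links breadth-first from a seed page, visiting
# each "stop" once, and build an inverted index (word -> set of URLs).

from collections import deque

def crawl(web, seed):
    index = {}                      # word -> set of URLs containing it
    frontier, seen = deque([seed]), {seed}
    while frontier:
        url = frontier.popleft()
        text, links = web.get(url, ("", []))
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
        for link in links:
            if link not in seen:    # visit each stop on the map once
                seen.add(link)
                frontier.append(link)
    return index

# Hypothetical three-page web.
web = {
    "a.com": ("subway map of the web", ["b.com"]),
    "b.com": ("search engines crawl the web", ["a.com", "c.com"]),
    "c.com": ("index every document", []),
}
index = crawl(web, "a.com")
print(sorted(index["web"]))  # ['a.com', 'b.com']
```

The inverted index built here is the structure that lets a query be answered in a fraction of a second: the engine looks up each query word and intersects the URL sets, rather than rescanning billions of documents.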

Search engines are answer machines. When a person looks for something online, it requires the search engines to scour their corpus of billions of documents and do two things – first, return only those results that are relevant or useful to the searcher’s query, and second, rank those results in order of perceived usefulness. It is both “relevance” and “importance” that the process of SEO is meant to influence.

To a search engine, relevance means more than simply finding a page with the right words. In the early days of the web, search engines didn't go much further than this simplistic step, and their results suffered as a consequence. Thus, through evolution, smart engineers at the engines devised better ways to find valuable results that searchers would appreciate and enjoy. Today, hundreds of factors influence relevance, many of which we'll discuss throughout this guide.

How Do Search Engines Determine Importance?

Currently, the major engines typically interpret importance as popularity – the more popular a site, page or document, the more valuable the information contained therein must be. This assumption has proven fairly successful in practice, as the engines have continued to increase users’ satisfaction by using metrics that interpret popularity.


Popularity and relevance aren’t determined manually. Instead, the engines craft careful, mathematical equations – algorithms – to sort the wheat from the chaff and to then rank the wheat in order of tastiness (or however it is that farmers determine wheat’s value).