Google Panda Update: Say Goodbye to Low-Quality Link Building

A while back, I wrote about how to get the best high volume links. Fast forward eight months and Google has made two major changes to its algorithm -- first to target spammy/scraper sites, followed by the larger Panda update that targeted "low quality" sites. Plus, Google penalized JCPenney, Forbes, and Overstock.com for "shady" linking practices.

What's it all mean for link builders? Well, it's time we say goodbye to low quality link building altogether.

'But The Competitors Are Doing It' Isn't an Excuse

This may be tough for some link builders to digest, especially if you're coming from a research standpoint and you see that competitors for a particular keyword are dominating because of their thousands upon thousands of pure spam links.

But here are two things you must consider about finding low quality, high volume links in your analysis:

  1. Maybe it isn't the links that got the competitor where they are today. Maybe they are a big enough brand with a good enough reputation to be where they are for that particular keyword.
  2. If the above doesn't apply, then maybe it's just a matter of time before Google cracks down even further, giving no weight to those spammy backlinks.

Because, let's face it. You don't want to be the SEO company behind the next Overstock or JCPenney link building gone wrong story!

How to Determine a Valuable Backlink Opportunity

How can you determine whether a site you're trying to gain a link from is valuable? Here are some warning signs that Google may have deemed, or may eventually deem, a site low quality.

  • Lots of ads. If the site is covered with five blocks of AdSense, Kontera text links, or other advertising chunks, you might want to steer away from them.
  • Lack of quality content. If you can get your article approved immediately, chances are this isn't the right article network for your needs. If the article network is approving spun or poorly written content, it will be hard for the algorithm to see your "diamond in the rough." Of course, when a site like Suite101.com, which has one hell of an editorial process, gets dinged, then extreme moderation may not necessarily be a sign of a safe site either (in their case, ads were the more likely issue).
  • Lots of content, low traffic. A blog with a Google PageRank of 6 probably looks like a great place to spam a comment. But if that blog doesn't have good authority in terms of traffic and social sharing, then it may be put on the list of sites to be de-valued in the future. PageRank didn't save some of the sites hit by the Panda update; several affected sites had a PageRank of 7 or above (including a PR 9).
  • Lack of moderation. Kind of goes with the above, except in this case I mean blog comments and directories. If you see a ton of spammy links on a page, you don't want yours to go next to it. Unless you consider it a spammy link, and then more power to you to join the rest of them.

What Should You Be Doing?

Where should you focus your energy? Content, of course!

Nine in 10 organizations use blogs, whitepapers, webinars, infographics, and other high-quality content for link building and to attract natural, organic links. Not only can you use your content to build links, but you can also use it to generate leads, by proving the business knows its stuff when it comes to its industry.

Have You Changed Your Link Building Strategy?

With the recent news, penalties, and algorithm changes, have you begun to change your link building strategies? Please share your thoughts in the comments!

Biggest Search Events of 2011 & Predictions for 2012

Everybody's been talking about search in 2011, but what were the events that helped to shape the search landscape of the year?

We ran a poll on SEOptimise in order to find out. While the biggest search impact of 2011 might not come as much of a surprise, some other events were notable by their absence.

Out of eight possibilities, one ranked as the clear leader, with twice the votes of its nearest rival at the last count. So, without further ado, let’s look back at the most notable search events of 2011.

The Google Panda Update

Google's Panda algorithm change was all about improving the quality of search results.

This has caused lots of problems for SEOs and webmasters, with many sites suffering huge drops in rankings and subsequent traffic as a result. There's also been no real quick fix, and for some sites the road back has been so long that they've had to change their whole business model in order to react!

SSL Search

Secure Sockets Layer (SSL) search allows Google users to encrypt their search queries. Google made this the "default experience" for signed-in users on Google.com in October and, as a consequence, stopped passing query data to analytics software including Google Analytics.

Users began to see "(not provided)" appearing in their Google Analytics data, indicating that the search had been encrypted and the keyword data was therefore not available.

Google has stated that, overall, only a single-digit percentage of keywords will be classed as “(not provided)”. However, from an SEO agency perspective, if you’ve set client targets for increases in non-branded search and can no longer accurately measure a full picture of where visits are coming from, there will be some difficulties here, as witnessed by the search industry’s reaction to this change!

Social Signals & Integration

With Twitter and Facebook now well established, LinkedIn covering the business angle, and Google+ still emerging on to the social stage, social signals and integration are impacting our search experience.

Both Facebook and Twitter are now widely integrated into websites, giving companies a 'face' and an easy way to deal with customer feedback, both positive and negative.

LinkedIn perhaps has less of an impact on websites' search rankings, although its highly search-visible profiles offer an easy way for professionals to appear in queries relating to their own name or work experience.

But it's Google+ that holds the potential to change search drastically, providing it can gain enough traction to build a dedicated and regular user base.

The +1 button is already appearing on blogs and websites across the web, and on browser toolbars too, putting search rankings directly in the hands of Google's users for the first time.
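For site owners, adding the button is a small markup change. Here is a minimal sketch using the plusone.js loader Google documents for the button (the size attribute is optional):

<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<g:plusone size="medium"></g:plusone>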

Siri

Siri is unarguably impressive. Responding to natural, conversational questions with relevant search results, the voice-activated search function on Apple's iPhone 4S ignited a media furore when it launched.

Yahoo Site Explorer

Yahoo retired its Site Explorer service in November as part of its partnership with Bing, advising its users to head over to Bing Webmaster Tools instead. Site Explorer actually predated Google Webmaster Tools by about a year, and had become a point of reference for many web marketers.

Yahoo Site Explorer allowed a glimpse into the performance of competitors' sites, and its retirement left a genuine gap among free online services in that respect.

Google Freshness Algorithm Update

Google's Freshness update affected over a third of search results - roughly 35 percent - and is part of the real-time search trend.

It ensures that search queries relating to time-sensitive events, such as the Olympics, are more likely to yield results about recent or upcoming events than about those held a long time ago.

Between 6 and 10 percent of searches were expected to change noticeably, with other types of content like news and reviews similarly impacted.

Microsoft-Yahoo Search Alliance

The Microsoft-Yahoo Search Alliance gave Microsoft direct access to some of Yahoo's search technologies as part of a 10-year licensing deal. Ostensibly, the alliance was part of an aim to deliver faster, more relevant results to users of both Yahoo search and Bing, with collaboration in other areas like paid search, too.

However, Google remains dominant, with the combined offering a distant second. And it seems that, unlike with Google's algorithm changes, web marketers were able to handle the transition smoothly enough that it had no negative effect on their search performance.

Predictions for 2012

So what might we see in the year ahead? Briefly, here’s what I expect:

Plenty More Privacy/Analytics Headaches

The rollout of SSL search from Google has only just started, with an increase in the number of queries affected widely anticipated. However, if the ICO doesn’t back down on the cookie directive law, this could be just the start!

If you can only track users who opt in to allowing cookies this will have an extremely significant impact on how we measure website performance via analytics. So this is definitely the big one to look out for in 2012.

Shifting Facebook Demographics

I expect that this will be the year that teenagers leave Facebook in droves. The kind of growth this platform has seen can’t continue – and young people will be the first out of the door. Not only do they currently have to see their parents’ status updates, their parents can see theirs. No teen wants that.

Marketers are going to have to make a real effort to remain on top of this changing market and make sure they know where the teenagers go.

Unification of SEO and PR… With HR

SEO and PR have gradually become more integrated. Expect this trend to continue in 2012. What could be even more interesting will be larger companies using their employees to aid their marketing.

From Twitter, to Facebook, to YouTube – businesses will increasingly ask their employees to get involved in their online promotion. This could blur the boundaries between professional and social profiles, so firms will need to set out ground rules before using their workforce this way.

Tablets Taking Over

For so long the focus has been on mobile, but companies can’t risk missing the latest boat. Tablets are rapidly becoming the norm; eMarketer is predicting there will be 90 million tablet users by 2014.

This could help unify TV and online marketing. Research agency Sparkler found that 51 percent of all tablet use occurs while the owner is watching TV. It’s a downtime device and so in 2012 businesses need to ensure their marketing strategies take advantage of this.

Google Announces Panda Updates Will Resume “Next Year”


Google announced via Twitter that Panda updates have ceased for 2011 and will resume after the New Year. Perhaps this is an early holiday gift to webmasters who need to cultivate content and links on their existing webpages. The last update came in November, and all of the recent updates have been considered “minor”, each affecting less than 1 percent of all searches. It’s important to understand, however, that with over 12 billion searches each month, even 1 percent still amounts to well over 100 million affected searches.
Google first released Panda in February and has released six updates since, while also noting that its search algorithms change “almost daily”. Now, with the writing on the wall, webmasters have about a one-month warning that the next update should arrive sometime in January 2012.

These are my top 3 areas to pay attention to:

Keyword Density:  Google has a strict policy when it comes to keyword density ratios, thanks to websites that lack quality content and simply publish a bunch of words trying to inflate their rankings.  Make sure the keyword density ratio is between 5–8 percent or Google may catch the page in its next update.  Read over the content and make changes (if necessary) to lower the density ratio if it falls above 8 percent (a minimal sketch for checking the ratio follows this list).
Relevancy of Keywords:  In addition to density, pay close attention to the relevancy of the keywords and keyword phrases.  Reputable sites including PR Newswire and Forbes have been affected because they rank for keywords that are often abused by unscrupulous websites.  Some simple ways of determining this are to keep an eye on the domain authority score and to search for those specific keywords, carefully observing the types of sites that also rank for them.  This will shed light on the authority of the chosen keywords and keyword phrases.
Fix broken links:  Sites known as “link farms” have abused the practice of posting links on their webpages to game the search algorithm.  If a page’s content contains broken links (pointing to URLs that no longer exist), fix or remove them.  Even if you aren’t trying to game the algorithm, Google may think that is what is happening.
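If you want a quick way to sanity-check a page against that density range, the ratio is simply keyword occurrences divided by total words. Here is a minimal JavaScript sketch, assuming it is run in a browser against the page body; the keyword "widgets" is a placeholder and the function only handles single-word keywords:

function keywordDensity(text, keyword) {
  // density = (keyword occurrences / total words) * 100
  var words = text.toLowerCase().split(/\s+/);
  var target = keyword.toLowerCase();
  var total = 0, hits = 0;
  for (var i = 0; i < words.length; i++) {
    var w = words[i].replace(/[^a-z0-9]/g, '');
    if (w.length === 0) continue;
    total++;
    if (w === target) hits++;
  }
  return total > 0 ? (hits / total) * 100 : 0;
}

// Example: warn if the page body falls outside the 5-8 percent range discussed above
var density = keywordDensity(document.body.innerText, 'widgets');
if (density > 8) {
  console.log('Density is ' + density.toFixed(1) + '% - consider trimming repeated keywords.');
}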
Operating a website and performing regular maintenance are necessary to ensure the page ranks well on Google.  This is a clear advantage of hiring an SEO company.  With Google’s “1-month warning”, webmasters have some time to perform some maintenance to ensure they are providing authoritative content that returns value to the end-user.

On-Page Optimization Not Dead: Long-Tail Keywords Increase Rankings, Conversions [STUDY]

On-page optimization for long-tail keywords can result in ranking more than a page higher in search results, compared to about half a page when optimizing for head terms, according to a study by New York-based SEO & SaaS company Conductor. They also found that long-tail keywords converted 2.5 times better than head terms.

You may remember the uproar from last fall, when SEOmoz purported there was a higher correlation between LDA scores and high rankings than any other factor. Some took this to mean that on-page optimization didn't matter. It’s a topic that still pops up now and again: my on-page optimization isn’t working, I don’t know if it’s worth it... on-page optimization must be dead.

Not so, says Conductor. In their research study The Long Tail of Search, Conductor examined the effects of on-page auditing and optimization for long-tail keywords, versus optimizing for head terms or failing to optimize on-page at all. Not surprisingly, they saw a downward movement of more than two positions for keywords with no on-page optimization.

“Even in 2011 – often at executive prodding – many marketers are still singularly focused on the most searched terms in their industry that are also the most competitive and difficult to move up the search rankings," Conductor CEO Seth Besmertnik told Search Engine Watch. "As our study shows, we think there is huge opportunity in the long tail of search for the savvy search marketer to move up the search rankings more rapidly and convert at a rate that is 2.5 times greater than for head terms.”

Conductor’s research involved thousands of keywords studied over a period of nine months, using the data collection and on-page recommendations of their SEO platform, Searchlight. They first segmented keywords into three groups:

  • Keywords with shrinking on-page issues (being resolved by SEO).
  • Keywords with growing on-page issues (not being resolved by SEO).
  • Keywords with no on-page issues.


On-Page Optimization Crucial for SEO

[Image: Average keyword rank movement for keywords (Conductor)]

Optimizing on-page elements for the keywords marketers want to rank for is critical, according to Conductor's research. On-page optimization for keywords identified by Searchlight as having on-page issues consistently resulted in rankings increases, by an average of 11.24 positions.

Websites with identified issues but no on-page optimization saw a two position drop. Keywords with no identified issues saw a less than one position increase.

Long-Tail Keywords Show Greatest Rankings Increases

[Image: Average keyword rank movement, head vs. long-tail keywords]

Recognizing that there are two ways marketers commonly use the word “long-tail,” they looked at query volume and the number of words in a query as separate issues and tested twice.

First, they excluded medium-volume keywords for the purpose of this study, focusing on those with either high (head) or low (long-tail) query volumes. In this breakdown, they found that long-tail terms were “significantly” more impacted by on-page optimization, with an 11 position increase, compared to six positions for higher volume, head keywords.

[Image: Average keyword rank movement for head vs. long-tail terms]

For the second part, they separated keywords according to the number of words in the term; head keywords were one- to two-word queries, while long-tail terms had three or more words. Again, they found that on-page optimization increased long-tail rankings more, but by a smaller margin. With this segmentation, long-tail terms rose an average of six positions and head terms an average of four.
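As a trivial illustration of that second segmentation (the three-word threshold comes from the study description above), a query can be classed by word count:

// 1-2 word queries = "head" terms, 3 or more words = "long-tail" terms
function isLongTail(query) {
  return query.trim().split(/\s+/).length >= 3;
}

isLongTail('running shoes');           // false - head term
isLongTail('red trail running shoes'); // true - long-tail term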

Long-Tail Terms Convert 2.5 Times More

[Image: Conversion rates, head vs. long-tail terms (Conductor)]

The final part of their study looked at conversion rates, examining more than 7 million visits to three major retailers. Long-tail terms – those with three words or more – converted two and a half times more than head terms. Conductor said this is a great opportunity for marketers who may be disproportionately focusing on higher volume, one- to two-word search terms.

On-page optimization is one of many strategies SEOs and marketers can use to increase rankings and conversions. It’s also just good practice to make sure your page addresses the issues that brought visitors to the site in the first place.

How to use Twitter as a lead generation tool

Twitter is among the top three social networking sites today and offers a micro-blogging service that lets its users send, receive and share messages via tweets. Tweets are text messages confined to 140 characters and can include links. At the time of writing, Twitter users send over 110 million tweets per day. With traffic like that, who can ignore the gold mine that is Twitter?

Twitter – no shortage of business opportunities

It is only natural for internet marketers to leverage Twitter as a lead generation tool. The beauty of Twitter is that it allows easy conversation, which is a gift for someone who is looking for business opportunities. Obviously, you would not try to sell directly to anyone with the prefix “@”. Keeping your eyes open and being alert to what is going on certainly helps. For example, someone might be looking for a specific service, and if you have the solution, you could suggest it to them and end up making a profit.

The shortcut to generating leads on Twitter

However, it is hardly practical to browse Twitter all day trying to find leads. Here is where we make use of third party tools to shorten the distance between you and the prospects. Let us take a quick look at what you can leverage:

Twitter advanced search

If you use Twitter, you are probably aware of the basic search function. It saves a tremendous amount of time when you run searches on your name, your business’s name, your brands, your competition, and other tags. It will also help you keep track of your online business reputation as you watch the response to your tweets.
However, the advanced search function is what you should really take advantage of because it lets you run a search for your keywords by geographical location. So, if you live in Jacksonville, KY, and run an advanced search for who is in Jacksonville, KY, you can see who is saying what in that area. You are looking at real-time data you can filter and develop into business opportunities.
When you look at the advanced search form, you will realize that it is quite simple to run a search around a prospective lead you might have identified. You can also use Twitter’s “search operators” and adapt the examples to come up with your own searches. Say you want to find out who needs your services in Lawrenceville, GA; you would create a search based on those terms and search operators.
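For example, searches along these lines combine keywords with Twitter's location and question operators (near:, within: and the trailing ?); the services and locations below are only placeholders:

"graphic designer" near:"Lawrenceville, GA" within:25mi ?
"anyone recommend" accountant near:"Lawrenceville, GA" within:15mi
plumber needed near:"Lawrenceville, GA" -filter:links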

Being responsive is the key

Twitter is full of people who post questions, opinions, requirements and just about anything else, and it is up to the marketer to make use of these tweets. If someone is looking for something, they will be pleased to have someone respond to their need, so your job is to identify, filter and respond to people asking questions or complaining. This becomes even more valuable if the response comes from someone in their area. As a marketer, you can plan and set up a series of tweets that could very well convert into hot leads on a daily basis.
Remember, though, that you must take care to see that people don’t consider you a spammer. Instead, identify your niche and target people in that segment by triggering an interesting conversation and being helpful, eventually generating leads.

Not just leads, but a whole network

When you work on generating business leads from Twitter, bear in mind that you can also build a community around you that could yield business partners for different aspects of your business.

RSS feeds can be a boon

Every search that you run will bring up RSS feeds you can subscribe to. You can decide which ones are worthwhile from the comfort of your RSS reader. The RSS function keeps your searches well organized, helping you respond to the ones you find lucrative and follow up.

You can also use the RSS function to republish the results from your search. If you happen to be planning an event, you can create a buzz around it by doing this. Create a #hashtag that is specific to your event, run a search on that tag and get the RSS feed. You can then use Feedburner’s BuzzBoost feature, under the Publicize tab of your Feedburner feed, to republish the dynamic feed on your site. It is easy.
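At the time of writing, the feed for a hashtag search can be grabbed with a URL of roughly this form (the hashtag is a placeholder) and then dropped into Feedburner:

http://search.twitter.com/search.atom?q=%23yourevent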

If you would rather do this right from your browser, use Tweetdeck to see the real time search results on your desktop.

This is only the tip of the iceberg when it comes to leveraging Twitter for lead generation. Do you have your own tips to share? Do post them in the comments section.

Conversion Tracking vs. Google Analytics Goals

Conversion tracking is a must-have. If you can, you should.

Simply knowing how many sales you generated for how much spend isn't enough. You need to be able to know which keywords (and better, which search queries) generated the enquiries, sales, leads and phone calls you're interested in.

This means you need a tool that integrates with AdWords, either by letting you get the data out, or by putting its own data in.

Exporting to a Tool

Getting the data out means tagging your landing pages with query strings. A solution on your site can then read that query string and capture whatever information you've passed in it.
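As a rough sketch, a destination URL might be tagged with ValueTrack-style parameters, and a small script on the landing page can read them back out; the parameter names and the example.com URL here are placeholders, not a prescribed format:

<!-- Hypothetical tagged destination URL:
     http://www.example.com/landing-page?kw={keyword}&match={matchtype}&ad={creative} -->
<script type="text/javascript">
// Read a tagged parameter back out of the landing page's query string
function getParam(name) {
  var match = window.location.search.match(new RegExp('[?&]' + name + '=([^&]*)'));
  return match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : '';
}
var keyword = getParam('kw');      // e.g. "blue widgets"
var matchType = getParam('match'); // e.g. "b" for broad match
// ...pass these to your CRM or analytics solution along with the lead or sale
</script>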

The big advantage of using these kinds of solutions is the range of them available. They can include attribution modeling and user journey paths and all kinds of exciting information. You can integrate them into CRM systems and log the keyword and user all the way through their lifecycle as a customer. Most of these systems will be able to pull keyword data from query strings you tag, and search query data from referral information, then pair these together in the customer's record if they convert.

The downside is that acting on that information in AdWords is a few steps removed. You need to use multiple different systems to see the data and take actions, such as increasing or decreasing bids based on conversion rate. Convenience and cost are the main disadvantages of these systems.

There are two options that take the opposite approach: putting conversion data into AdWords and analyzing it there. AdWords' own conversion tracking system and Google Analytics will both do this, and both are free. Let's take a look at how they're different and why you might choose each option.

Importing Into AdWords

Importing the data straight into AdWords has one major advantage over using separate tools: the data you need to optimize from is right next to the areas you need to change in order to optimize. You can view conversion volumes and cost per conversion next to each keyword or ad, and you can make changes appropriately directly in the interface. If you have enough conversion volume you can also take advantage of Google's "conversion optimizer".

AdWords Conversion Tracking

AdWords' system involves a snippet of JavaScript that you place on the confirmation page after the user has taken the action you want to track. The users are completely anonymized: you can see the search query used, the ad seen, and the keyword that triggered the ad, but none of this can be tied to the individual conversion. With a little customization of the code you can attach a value to any conversion if appropriate.

[Image: AdWords conversion tracking code]
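The snippet itself is short. Here is a rough sketch of its shape; the conversion ID, label, and value are placeholders rather than real account values:

<!-- Placed on the confirmation page only -->
<script type="text/javascript">
var google_conversion_id = 1234567890;       /* placeholder conversion ID */
var google_conversion_label = "AbCdEfGhIjK"; /* placeholder conversion label */
var google_conversion_value = 25.00;         /* optional: value of this conversion */
</script>
<script type="text/javascript" src="https://www.googleadservices.com/pagead/conversion.js"></script>
<noscript>
<img height="1" width="1" style="border-style:none;" alt="" src="https://www.googleadservices.com/pagead/conversion/1234567890/?label=AbCdEfGhIjK&amp;value=25.00&amp;guid=ON&amp;script=0"/>
</noscript>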

There is typically a 30-day window on these conversions. Any visitor who has clicked on one of your Google ads in the 30 days before they converted will be tracked as a converted click in AdWords. The conversion will be displayed at the date and time of the original click. So if visitor A searches and clicks on Monday, then comes back to purchase on Friday, the conversion happens on Friday but shows up in Monday's stats.

AdWords tracks conversions on a last-click-wins basis (amongst AdWords clicks only; no other traffic sources are included). So you will find that if a searcher clicks ads from several different keywords before they convert, the credit will only go to the final keyword searched on. This will often lead AdWords to assign a higher weighting to your brand terms than you might expect.

AdWords contains a "Search Funnels" feature to let you see all the AdWords touchpoints before a user converted. You'll be able to see the average time lag of a user from click to conversion, but more helpfully you'll also be able to see the keywords that "assisted" conversions (e.g., were not the last click but were still part of the user's path). These keywords would have received no credit in the main AdWords interface.

[Image: AdWords Search Funnels report]

Importing Goals From GA

If you have Google Analytics set up with goals recording conversions (or e-commerce tracking), then you can import these into AdWords directly.

Goals imported from GA give you some additional features compared to AdWords tracking, but come with some differences that you'll need to know about.

First, Google Analytics is a last-click-wins system across all traffic sources, apart from direct traffic. That means that if a user comes through an AdWords ad but then returns another way before converting, GA won't log that as an AdWords conversion. You should expect GA goals to under-track a little compared to AdWords tracking for this reason.

Second, GA records conversions to the date and time of the visit that converted, as opposed to AdWords' method of recording them against the click that generated the conversion. This makes a bigger difference than you might think. Even in FMCG it's not unexpected to see click-to-conversion lags of a few days or weeks. In retail you should expect fewer than 70 percent of people to purchase on the same day as the click, so that's quite a big difference in daily conversions between AdWords (today's conversions being tracked back to whenever they clicked) and GA (today's conversions being recorded today).

Goals imported from Google Analytics can take up to 48 hours to appear in AdWords, so you may not be able to immediately see the effect of your optimizations without going into GA directly.

Why Use One Over the Other?

These differences in behavior may indicate to you which type of conversion tracking is more suitable for you depending on your overall preferences.

  • If you want any traffic going through AdWords to be classed as an AdWords conversion, AdWords conversion tracking would be the preference.
  • If you want only last-click AdWords conversions to be tracked, import from Google Analytics.
  • If you want full attribution modeling, then do your tracking in a third party tool (I'm classing multi-channel funnels in GA as a third party tool here, since the data from there can't be imported directly into AdWords, but must be analyzed separately).

Google Analytics holds one trump card still: engagement goals. GA will let you set a threshold for certain metrics (e.g., pages viewed or time on site) and set any visit that goes over that threshold as a goal. For certain content sites these are really worthwhile. Non-bounce visits could be a great signal to optimize for if you run a site that has no other real tracking options.

[Image: Non-bounce visit goal in Google Analytics]

You can see that each of these methods offers different benefits and drawbacks, and in some cases different biases in the data you'll see that you need to account for. Those biases can be pretty large, so don't be surprised if different conversion tracking sources don't match, and make sure you understand why each might be tracking something in a slightly different way.

Just adding conversion tracking isn't enough; you need to think about which system to implement and what that means for you.

Listing Dynamic Web Pages in Search Engines


Problems With Dynamic Languages

Most search engines do not like to follow JavaScript or spider deeply into dynamically driven websites. Since scripts can create millions and millions of pages on the fly, search engines need to ensure some level of quality before they will list all of those pages.

Session IDs & Cookies

Search engines do not like reading session IDs or cookies. When a search engine spider is given a session ID, it will usually leave the site. If spiders indexed websites that offered them session IDs, they would frequently overload the server, and session IDs would also make the site seem many times larger than it is.
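For example, a page served with a PHP session ID in the URL looks something like this to a spider (the domain and ID are made up), and the same page appears under a different ID on every visit:

http://www.example.com/catalog.php?PHPSESSID=a1b2c3d4e5f67890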

General Dynamic Page Listing Tips

Search engines are getting better at listing dynamic web pages. There are a few basic rules of thumb to help search engines index your dynamic website.
  • Build a linking campaign. As you get more inbound links, search engine spiders will have more reason to trust the quality of your content and will spider deeper through your site.
  • Use short query strings. Most search engines will do well at listing dynamic pages if each query string is kept to fewer than 10 characters.
  • Minimize the number of variables. When possible, use three or fewer parameters; the fewer the better. If you use long parameter strings and a half dozen parameters, it is a fair bet that the site will not get indexed.
  • If you still have problems you can use a CGI script to take the query string out of your dynamic content or try one of the other dynamic page workarounds listed below.
  • Other dynamic workarounds: There are multiple ways to list dynamic web pages in search engines. Common techniques are:
    • Hard coded text links
    • Bare bone pages
    • Coldfusion site maps
    • Apache Mod Rewrite
    • IIS Rewrite Software
    • Paid Inclusion

Hard Coded Text Link

To list dynamic web pages in search engines, you can capture the entire URL in a link like:
<a href="http://www.search-marketing.info/catalog.html?item=widget&color=green&model=6">Green widgets style 6</a>
If you use static links like the one listed above to reference dynamic pages, search engines will usually be able to index them. Many sites use a site map that captures the most important interior pages.
If you have enough link popularity and link into a couple of your main category pages using static links, then search engines will usually be able to index the rest of the site.

Bare Bones Pages

You can also make a bare-bones static page for each page you want listed in search engines.
<html>
<head>
<title>Green Widgets style 6</title>
</head>
<body>
<!--#include virtual="/cgi-bin/myscript.pl?greenwidget-6"-->
</body>
</html>

Coldfusion Site Map

For ColdFusion, you can code a site map page using code similar to the following.
<cfquery name="getPages" datasource="myDatasource">
SELECT *
FROM pages
WHERE status = 1
ORDER BY sortorder
</cfquery>
<cfoutput query="getPages">
<a href="index.cfm?pageID=#getPages.pageID#">#getPages.pageName#</a>
</cfoutput>

Apache Mod Rewrite

For Apache servers there is a way to make dynamic pages appear to be static pages, called mod_rewrite. The documentation for mod_rewrite is located on the Apache website.
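As a rough, hypothetical illustration (the paths and parameters below are made up), a rule in an .htaccess file can map a static-looking URL onto the underlying dynamic script:

RewriteEngine On
# /widgets/green/6.html  ->  /catalog.php?item=widget&color=green&model=6
RewriteRule ^widgets/([a-z]+)/([0-9]+)\.html$ /catalog.php?item=widget&color=$1&model=$2 [L]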

IIS Rewrite Software

IIS servers do not have built-in rewrite features like Apache mod_rewrite. You can still rewrite your URLs on IIS servers using custom-built software programs.

Trusted Feed Paid Inclusion

You can pay to use trusted feeds to upload dynamic content to Yahoo! and acquire traffic on a cost-per-click basis through the Overture Site Match program. I would consider this a last option for many sites, since many business models cannot support the incremental cost-per-click charges.

Opinion: 3 Onsite SEO Myths and Facts – What Really Counts?

Before starting this article I would like to note that I am specifically talking about Google. The information below might not apply to other search engines.

Everybody who is into SEO knows that it is more than just link building and offsite techniques. Sure, links matter the most, but what about your website itself? Onsite optimization might not be the most important part of SEO according to some people, but that depends on your point of view. To me, there have always been some onsite factors that play a significant role in the eyes of Google.

Of course, the most important thing when optimizing a website, according to link builders, is getting links from websites with a high TrustRank, and the most important thing according to web designers is proper coding. Since the people I work with and I focus on SEO as a complete process, we concentrate on everything that is important.

However, there are some things that just don’t matter as much as others, especially when doing onsite SEO. Google changes its algorithm almost every day, so a lot of the old onsite techniques that once worked are now useless thanks to the many spammers who exploited them. So what exactly are the onsite factors that can affect your rankings?

WEBSITE TITLE

The title of your website is one of the most important things when it comes to onsite SEO and better rankings. Here is what people believe to be true and what the truth really is:

Myth

A common mistake people make when doing onsite optimization is stuffing keywords into the title of their website, thinking it will help them rank better. Keyword stuffing was a technique that was somewhat effective a long time ago, until Google found out that spammers were using it to their advantage. So Google changed its algorithm, making a website’s ranking depend more on links and less on onsite factors.

Fact

The title of your website matters a lot, and if you don’t want to get a penalty, you need to keep it simple and user-friendly as well as Google-friendly. This is the place where you get to describe your whole website/business in about 10-15 words. I am not saying you should not put keywords in there. Quite the opposite – put your most important keywords in the title, but make sure you put them in a way that doesn’t look spammy, rather than just stuffing them in and separating them with commas.

Tips

When writing your website title, be creative and don’t repeat anything more than once. For example, if you are selling silver and gold jewelry, writing “Silver jewelry, gold jewelry…” in your title is not a good idea. Instead, use the phrase “Silver and gold jewelry”. You should know that Google doesn’t care about the order of the words, and you will get credit for each permutation.
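As a hypothetical example of a title written that way (the store name is made up):

<title>Silver and Gold Jewelry - Handmade Rings, Necklaces &amp; Bracelets | Example Jewelers</title>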

URL STRUCTURE

The most obvious thing is the domain name. If your domain name is an exact match for your most competitive keyword, you’re off to a good start. However, the rest of the URL structure is also a very important onsite factor, and many people are still doing it all wrong.

Myth

Again, a very common myth is that stuffing as many keywords as possible in the name of each page will work.

Fact

A website with a better URL structure has an advantage over a website with dynamic URLs. Although dynamic URLs can be read by Google, they simply don’t have the same effect as properly structured ones.

Tips

When taking care of your URL structure, the most important thing is to name each page of your website after its most relevant keyword. Creating many pages with descriptive names that are also your keywords will pay off better than having dynamic URLs.
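For instance (example.com and the page names are hypothetical), keyword-named static pages versus dynamic equivalents of the same pages might look like this:

Keyword-named static URLs:
http://www.example.com/silver-jewelry.html
http://www.example.com/gold-jewelry.html

The same pages served as dynamic URLs:
http://www.example.com/index.php?cat=12&item=348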

AGE

The age of a website is another factor that plays a big role when it comes to its rankings but not in a way that some people think.

Myth

A lot of people believe that a website will get better rankings over time on its own. So their strategy is to just sit there and wait, because they believe that a website that is 3 years old should automatically rank better than a website that is 4 months old, no matter what. They believe that even if no offsite optimization has been done to the old website, it will still rank better than a new website with a lot of backlinks, for example.

Fact

The age of a website does matter to Google. However your website will not rank high just because it’s old. The only thing that is affected by the site age is the amount of TrustRank it gets from the links pointing to it. The first two months, you will most likely not rank at all in Google because you will be in the “sandbox” where all new websites go. Then you will start receiving a bigger percentage of TrustRank as your website gets older. 4 years after the creation of your website, you will start receiving 100% of the TrustRank that the links pointing to your website pass.

Tips

Just because your website will be in the sandbox for the first two months, doesn’t mean you should sit and wait for the time to pass and then start building links. Instead, use the time to get some links and enjoy the credit Google will give you for them when the trial period is over.

Conclusion

These are 3 of the most important onsite SEO factors you should focus on, but I want to touch on two other factors people still think matter: the XML sitemap and the coding. Just to be clear – this article is about which onsite factors help you get better rankings, not about what makes Google’s job easier. Of course, the XML sitemap is a really great thing, and it certainly helps Google crawl your website faster and therefore index it faster. However, your sitemap has nothing to do with your rankings at all, and neither does the coding of your website.

Concentrate on what is really important and don’t worry about things web designers and other charlatans tell you in order to get more money from you.

Google Algorithm Updates: The Latest Things To Consider

Google algorithm "transparency" continues

Google has been making a big deal about wanting to be more transparent about its search algorithm lately (without revealing the secret sauce too much of course). And so far, I have to say they're making good on that promise fairly well.

Is Google being transparent enough for your liking?

We've seen plenty of algorithmic announcements from the company over the course of the year. In November, they discussed ten recent changes they had made. Here's a recap of those:

  • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
  • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
  • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page's title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page's content.
  • Length-based auto complete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
  • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
  • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
  • Fresher, more recent results: As we announced just over a week ago, we've made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
  • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
  • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
  • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.
 

Complete Tutorial of SEO Copyright © 2012