Thursday, July 5, 2012

The Holy Grail of Panda Recovery – A 1-Year Case Study

Prospective client: Help! I’ve fallen, and I can’t get up!
Me: Where are you?
Prospective client: Massachusetts
Me: I’m 3,000 miles away! Call 911!
Prospective client: No, it’s my website! We lost 50% of our traffic overnight! Help!
Me: How much can you afford to pay?
Okay, that’s not actually how the conversation went, but it might as well have. I came in and performed a strategic SEO audit, the client made all the changes I recommended, and, as a result, a year later they’ve increased organic traffic by 334%!

Now, when you look at that data, you might assume several things. Ryan Jones saw those numbers and tweeted just such an assumption:

Rather than leaving people assuming what was done to get such insanely great results, I figured it would make sense to write an actual case study. So once again, I’m going to ask you to indulge my propensity for excruciatingly long articles so that I can provide you with the specific insights gained, the tasking recommended, the changes made, and the lessons learned.

First Task – Look at the Data for Patterns

One of the first things I do when performing a strategic SEO audit is look at the raw data from a 30,000-foot view to see if I can find clues as to the cause of a site’s problems. In this case, I went right into Google Analytics and looked at historic traffic in the timeline view going back to the previous year. That way I could both check to see if there were seasonal factors on this site, and I could compare year-over-year.
That’s when I discovered the site was not just hit by Panda. The problem was much more deeply rooted.

The MayDay Precursor


The data you see above is from 2010. That major drop came from Google’s MayDay update: daily organic visits plummeted as a result. It’s something I see in about 40% of the Panda audits I’ve done.
From there, I jumped to the current year and noticed that, in this case, MayDay was a precursor to Panda. Big time.

What’s interesting to note from this view is the site wasn’t caught up in the original Panda. The hit came the day of Panda 2.0.
Okay, let me rephrase that. Both of my claims so far, that the site was hit by MayDay and by Panda 2.0, are ASSUMPTIONS. We work in a correlation industry, and since Matt Cutts refuses to poke this SEO bear, I doubt he’ll ever give me a personal all-access pass to his team’s inner project calendar.

Correlation-Based Assumptions

Since we don’t have the ability to know exactly what causes things in the SEO world, we can only proceed based on what Google chooses to share with us. This means it’s important to pay attention when they do share, even if you think they’re being misleading. More often than not, I’ve personally found that they really are forthcoming about many things.
That’s why I rely so heavily on the work of the good Doctor to keep me informed, since I can’t stay on top of Google forum chatter myself. I’m talking about the work Dr. Pete has been doing to keep the SEOmoz Google Update Timeline current.

An Invaluable Correlation Resource

The SEOmoz update timeline comes into play every time a site calls me in to help with a significant drop in organic traffic. I can go in there and review every major Google change that’s been publicly announced all the way back to 2002. You really need to make use of it if you’re doing SEO audits. As imperfect as correlation work can be, I’ve consistently found direct matches between the major traffic drops that bring people to me for help and entries in that timeline.

The Next Step – Is It Actually a Google-Isolated Issue?

Once I see a big drop in a site’s organic traffic, I dig deeper to see whether it’s actually a Google hit, an issue across all search engines, or even a site-wide issue. Believe me, this is a CRITICAL step. I’ve found sites that “appeared” to be hit by search engine issues, when it turned out the site’s server had hit a huge bottleneck. Only by comparing ALL traffic vs. organic search vs. Google-specific traffic will you be able to narrow down and confirm the source.
In this case, back during the Panda period, organic search accounted for close to 80% of site-wide traffic totals. Direct and Referrer traffic showed no dramatic drop over the same time period, nor did traffic from other search sources.

Curiosity Insight – Google Probes Before Big Hits

Not always, but once in a while when Google is about to make a major algorithm change, you can see in hindsight that they turned up crawl volume in a big way.


The SEO Audit – Process

Here’s where the rubber meets the road in our industry. You can wildly guess all you want about what a site needs to do to improve. You can just throw the book at a site in terms of “the usual suspects.” You can spend a few minutes looking at the site and think, “I’ve got this.” Or you can spend hundreds of hours scouring thousands of pages and reviewing reams of spreadsheet reports and dashboards, all to come up with a plan of action.
Personally, I adopt the philosophy that a happy middle ground exists, one where you look at just enough data, examine just enough pages on a site, and spend just enough time at the code level to find those unnatural patterns I care so much about because it’s the unnatural patterns that matter. If you’ve examined two category pages, you’re almost certainly going to find that the same fundamental problems that are common to those two pages will exist across most or all of the other category pages. If you look at three or four product pages (randomly selected), any problems that are common to most or all of those pages will likely exist on all of the thousands or tens of thousands of product pages.
So that’s what I do. I scan a sampling of pages. I scan a sampling of code. I scan a sampling of inbound link profile data and social signal data.

Comprehensive Sampling

Don’t get me wrong. I don’t just look at a handful of things. My strategic audits really are comprehensive in that I look at every aspect of a site’s SEO footprint. It’s just that at the strategic level, I’ve found that going beyond a sampling review of those factors doesn’t yield enough additional insight to justify the time spent and the cost to clients. The results that come from implementing one of my audit action plans have proven, over the last several years, to be quite significant.

Findings and Recommendations

For this particular site, here’s what I found and what I ended up recommending.
The site’s Information Architecture (IA) was a mess. It had an average of more than 400 links on every page, content was severely thin across most of the site, and the User Experience (UX) was the definition of confused.
Top Half of the Pre-Audit Home Page
Note that the screen shot above just displays the TOP HALF of the pre-audit homepage. Put yourself in a user’s shoes. Sure, some users might find a homepage with this design gives them a lot of great choices for finding information. In reality, all this did was confuse most users. They spent a lot of time trying to digest all the options, and the overwhelming majority of users never actually clicked on most of those links.
That’s why click-through data is so vital at this stage of an audit. Back when I did this, Google didn’t yet have their on-page Analytics to show click-through rates, and I didn’t want to take the time to get something like Crazy-Egg in place for heat maps. Heck, I didn’t NEED heat maps. I had my ability to put myself in the common user’s shoes. That’s all I needed to do to know this site was a mess.
And the problems on the homepage were repeated on the main category pages, as well.
Pre-Panda Recovery Category Page

Topical Confusion of Epic Proportions

Note the little box with the red outline in the above screen capture. That’s the main content area for the category page, and all those links all around it are merely massive topical confusion, dilution, and noise. Yeah, and just like the homepage, you’re only looking at the TOP HALF of that category page.
Duplicate content was ridiculous. At the time of the audit, there were more than 16,000 pages on the site and thousands of “ghost” pages generated by very poor Information Architecture, yet Google was indexing only about 13,000. Article tags were automatically generating countless “node” pages, and duplication along with all the other issues completely confused Google’s ability to “let us figure it out.”

Site Speed Issues

One of the standard checks I do when conducting an audit is to run a sampling of page URLs through Pingdom and URIValet to check on the speed factor. Nowadays I also cross-reference that data with the page-level speed data provided right in Google Analytics, an invaluable three-way confirmation check. In the case of this site, there were some very serious speed problems. We’re talking about some pages taking 30 seconds or more to process.
Whenever there are serious site speed issues, I look at Pingdom’s Page Analysis tab. It shows the process time breakdown, times by content type, times by domain (critical in today’s cloud-driven content delivery world), size by content type, and size by domain. Cross-reference that with URIValet’s content element info, and you can quickly determine the most likely cause of your biggest problems.
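To give a rough idea of what that kind of sampling check looks like, here’s a minimal Python sketch that times a handful of page URLs and reports download time and page size. It’s illustrative only, not the actual Pingdom/URIValet/Analytics workflow described above, and the URLs are made up.

    # Time a small sample of pages and report elapsed time and size,
    # so unusually slow templates stand out. Illustrative sketch only;
    # the sample URLs are hypothetical.
    import time
    import urllib.request

    SAMPLE_URLS = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/product/12345/",
    ]

    for url in SAMPLE_URLS:
        start = time.time()
        with urllib.request.urlopen(url, timeout=60) as response:
            body = response.read()
        elapsed = time.time() - start
        print(f"{url}: {elapsed:.2f}s, {len(body) / 1024:.0f} KB")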

When Advertising Costs You More Than You Earn from It

In the case of this site, they were running ads from four different ad networks. Banner ads, text link ads, and in-text link ads were on every page of the site, often with multiple instances of each. And the main problem? Yeah, there was a conflict at the server level processing all those different ad block scripts. It was killing the server.

Topical Grouping Was a Mess

Another major problem I found while performing the audit: not only was there a lot of thin content and a lot of “perceived” thin content (site-wide and sectional common elements overcrowding the main page content), the content was also poorly organized. Links in the main navigation bar pointed to content that wasn’t relevant to the primary intent of the site. Within each main section, too many links pointed to other sections, so there was no way search engines could truly validate that “this section is really about this specific topical umbrella.”

Topical Depth Was Non-Existent

Even though the site had tens of thousands of pages of content, keyword phrases had been only half-heartedly determined, assigned, and referenced. Add to that an over-saturation of links to only slightly related content, a lack of depth in the pure text content on category and sub-category pages, and sectional and site-wide repetition of headers, navs, sidebars, callout boxes, and footer content, and the end result was that search engines couldn’t grasp that there really was a lot of depth to each of the site’s primary topics and sub-topics.

Inbound Link Profile

A review of the inbound link profile showed a fairly healthy mix of existing inbound links; however, the volume of links using varied anchor text was weak, especially when compared with the profiles of top competitors.

Putting Findings Into Action

Another critical aspect of my audit work is taking those findings and putting them together in a prioritized action plan document. Central to that is knowing what to include and what to leave out. For a site with this much chaos, I knew there was going to be six months of work just to tackle the biggest priorities.
Because of this awareness, the first thing I did was focus my recommendations on site-wide fixes that could be performed at the template level. After all, most sites only have one, two, or a few primary templates that drive the content display.
From there, I leave out what I consider to be “nice to have” tasking, things a site owner or the site team COULD be doing but that would only be a distraction at this point. Those are the things that can be addressed AFTER all the big picture work gets done.

Tasking for This Site

After the rest of my audit review for this site, I came up with the following breakdown of tasking:
Improved UX
  • Eliminated off-intent links from main nav
  • Reordered main nav so most important content nav is to the left
  • Reduced links on home page from 450 down to 213
  • Rearranged layout on home page and main category pages for better UX
Addressed server speed and script noise issues related to ad networks
  • Removed in-text link ads
  • Reduced ad providers from four networks down to two
  • Reduced ad blocks from seven down to five on main pages
  • Reduced ad blocks on sub-cat and detail pages from nine down to seven
  • Eliminated ad blocks from main content area
Refined keyword topical focus across top 100 pages
  • Rewrote page titles
  • Modified URLs and applied 301s to old version
  • Rewrote content across all 100 pages to reflect refined focus
  • Reorganized content by category type
  • Moved 30% of sub-category pages into more refined categories
  • Eliminated “node” page indexing
Inbound Link Efforts
  • Called for establishing a much larger base of inbound links
  • Called for greater diversity of link anchors (brand, keyword, and generic)
  • Called for more links pointing to deeper content
________________________________________________________________
Here’s a screen capture of the top half of the NEW, IMPROVED home page.
Post Panda Home Page top half
Yeah, I know. You probably have an opinion on colors, aesthetics, and whatnot. I do, too. However, this wasn’t about aesthetics. That’s exactly the kind of issue that can be dealt with AFTER all the priority work happens.

Massive Changes Take Time

Given how much work had to be done, I knew, as is usually the case with action plans I present to audit clients, that they were in for a long overhaul process that would last several months. I knew they’d start seeing improvements as long as they followed the plan and focused on quality fixes, rather than slap-happy, just-get-it-done methods.

Results Are Epic

The results over the past year have been truly stunning, to say the least. I knew the site would see big improvements in organic data. I just had no idea it would be on this scale.

Here are three screen captures of the actual Google Data, just so you can see I’m not making all of this up.

The Panda Effect
Post Panda – Audit Work Starts Rolling Out
What A Difference Sustainable SEO Makes

Takeaways

As you might venture to guess, the recovery and the subsequent, continued growth have been about as “best case scenario” as you can get. It truly is one of my proudest moments in the industry. The fact that, at the time I performed the audit, not a single other site had been reported as recovering means more than words can describe.
Sustainable SEO Really Does Matter
The biggest takeaway I can offer from all of this is that sustainable SEO really does matter. Not only has the site rebounded and gone on to more than triple its organic traffic, it has sailed through every major Google update since. What more can you ask for as a correlation between sustainable SEO best practices and end results?
UX – It’s What’s for Breakfast
I drill the concept of UX as an SEO factor into every brain that will listen. I do so consistently, as do many others in the search industry. Hopefully, this case study will allow you to consider more seriously just how helpful understanding UX can be to the audit evaluation process and the resulting gains you can make with a healthier UX.
IA – An SEO Hinge-Pin
With all the IA changes made, Google now indexes close to 95% of the site content, more than 17,000 pages versus the 13,000 previously indexed. Their automated algorithms are much better able to figure out topical relationships, topical intent, related meanings, and more.
Ads as Shiny Objects
This was a classic case study for many reasons, one of which is the reality that a site owner can become blinded by the shiny object of advertising revenue. Yet there comes a point where you cross the line into “too much.” Too many ads, too many ad networks, too many ad blocks, too many ad scripts confusing the server…you get the idea. After the changes we made, overall site revenue from ads went up proportionally as well, so don’t let the idea that “more is always better” fool you.

Not Provided Considerations – Long Tail Magic

This observation applies to just about any site, but it’s more of a comfort for medium and large-scale sites with thousands to millions of pages.
If you look at the first spreadsheet I included here, you’ll note that the total number of keywords used to discover and click through to the site appears to have spiked and then leveled off. Look more closely at that data, though, and you’ll see that the plateau is directly related to both “Not Provided” and “Other,” with “Other” actually accounting for more keyword data than “Not Provided” (approximately 13% vs. 8%).
The more critical concept here is that even though you may in fact be missing opportunities because of those two buckets, look at the other aspect of the keyword count.

First, by cleaning up and refining the topical focus on the 100 most important pages and instilling proper SEO for Content Writing into the client mindset, we’ve taken the long tail to a whole new level, a level people may have difficulty comprehending when I talk about REDUCING the number of keywords on a page.
Yet it’s that reduction and refinement, coupled with truly high-quality content writing that is GEARED TOWARD THE SITE VISITOR, that finally allows search engines to do a really good job of figuring out that all the variations for each page and each section of the site really do match up from a user-intent perspective.
If you were to unleash the “Other” and “Not Provided” phrases into the total count, you’d see that the site now generates visits via more than a million different phrases over the past six-month period. A MILLION PHRASES. For all the potential opportunities in “Not Provided,” how much time would YOU be able to spend analyzing over a million phrase variations looking for a few golden nuggets?
Personally, my client and I are quite content with not being able to dig into that specific metric in this case. So while it’s a bigger challenge for much smaller sites, when you’re at this level, I’m more concerned that Google says, “You have so much data we’re just going to group hundreds of thousands of phrases into ‘Other’.”

In Conclusion

Well, there you have it, my very first Panda Recovery Audit a year later. And for the record, the findings and recommendations I detailed in this case study were presented at SMX Advanced 2011. At the time, some of the feedback I got was “That’s nice, but it’s all speculation. Nobody’s told us this is for sure how to fix Panda.” #WIN

How to Generate Serious Income Using Content You’ve Already Written?

Your content is fantastic. It’s so good that people would pay to read it.
You probably think that sounds crazy. Why would people pay to read something they could read for free? Luckily for you, it’s not crazy, and it’s easy to make money using the content that’s already on your website.

You don’t have time so why would they?

You know what it’s like to run a business and still make the time to seek out great articles that can take the business forward. It’s hard work. It takes time.
Everyone else feels the exact same way. They don’t want to waste their time repeatedly searching through your website looking for all the great content.
That’s why you should make it easily accessible to them. Take all your great content and package it together.

There’s no easier way to make money

Once you’ve packaged all your content together into a book, people will be happy to pay you money for taking away the trouble of searching it out.
You’re saving people time. That means you’re saving people money. They could spend hours and hours searching for what they need, or they could pay you $20 to have it all in one place they can refer back to.
That’s $20 for information that’s already available for free. When you multiply that by the number of people who are looking to save time, you end up with a lot of money.

Make it so good they’ll be back

This isn’t going to take long, but you need to make sure that you make the book attractive to the customer. Don’t just take a bunch of articles and throw them together.
Carefully compile them in a way that makes it look like a proper book. This means you’re going to have to edit it. Make sure the chapters flow together.
Have internal hyperlinks so they can easily reach anything as quickly as possible. Now they know that your book delivers the goods.

What’s the easiest way to make a sale?

Guess what happens when you offer them a new book? They don’t even think about it. They will automatically buy it because they know they’ll waste money if they don’t.
You must know by now that it’s much easier to sell a product to an already satisfied customer. That’s why you need to make sure you always deliver to the best of your abilities.

Spread your content further using the Kindle store

It’s one thing selling to people who already visit your site, but how can you take advantage of your free information and sell it to a much broader audience?
Easy. Take the exact same content in your book and add it to the Amazon Kindle store. The best part is, it’s totally free.
Amazon is one of the world’s biggest search engines and putting your book into the Kindle store automatically makes it accessible to millions of people who search Amazon every day.
Formatting your book for Kindle will take you a few hours at most. Now all you have to do is sit back and make money by doing nothing. Money that you didn’t even realize you could make.

How to Spread Bet?

If you want to make money spread betting, the first thing a new trader needs to do is learn how spread betting works, how financial markets operate and, finally, how these can be traded. These three elements are fairly simple to research online, and most decent trading websites have great introductory sections where new traders can learn all about financial markets. Assuming a trader has thoroughly covered the first two elements, how spread betting works and how financial markets operate, the next stage is where the excitement begins.

Make Sure You Understand the Risk First

Although the final stage (the actual spread betting in real-time markets) is the most fun, it is also the riskiest, and the place where a promising spread betting career can quickly tumble and disappear. To reduce that possibility, traders need to understand risk management and how to limit losses. This should be learnt thoroughly when learning how spread betting works, along with the realisation that spread betting carries real risk as well as the possibility of large profits. With this in mind, a trader can prepare to learn how to spread bet.

Getting an Account

The first thing a trader needs to do before learning how to spread bet is to set up an account. This is incredibly easy: accounts are available online, registration for most of them takes only a few minutes, and the required trading deposit is usually less than £100. Many accounts also offer great introductory incentives to new traders, such as cash back, education packages and low-stake trading for the first few weeks. It is certainly worth shopping around for the best deal to suit your spread betting needs. Once the account has been opened, it is a case of getting to grips with the platform, learning how to simulate opening a trade, and using the charting packages that the platform offers.

Placing a Trade

The next stage, before placing a single trade, is to devise a trading strategy. This is a set of rules based on the system that you are going to trade. Without this you will simply be blindly gambling and you will see your account depleted in a matter of trades. A solid strategy can be tested to show that it will make money before real money is used. Once this has been tested it can be traded strictly following the rules that you design. All successful spread betters put much of their trading success down to being able to follow their strategies strictly in every single trade.

The best link building strategies

Link building is one of the most important search engine optimization strategies. It improves your search engine rankings and at the same time directs traffic to your website (prospective customers can use the back links to reach your e-commerce site). In addition, the strategy helps you build a good reputation by providing quality content on the submission directories. Here are some effective link building tips you might want to consider.
How to make quality article submissions
Article submission is one of the ways online marketers build back links. As the name implies, the technique involves writing and then submitting your articles to reputable directories which have high authority with the search engines. You can either write the articles yourself or outsource the task to online freelancers. The latter is considered beneficial and time-saving by most internet marketers. Besides, freelancers are capable of generating the high-quality content needed to entice prospective customers into visiting your website.
Remember, you need to build as many links as possible to help improve your search engine rankings. For this reason, it is important to submit your articles to many directories, with each giving you a back link. Sometimes manual submission becomes monotonous and tiring. If so, consider the services of a directory submissions firm.
Link building with social networking websites
Apart from articles, you can also build links using social network sites. You just have to maintain a good rapport with your followers by providing them with useful information about your products or services. Using social networks to build your links is seen as a cost-effective strategy since most of them are free; besides, you do not need to hire ghost writers. For the best results, use language that relates well to your target audience.
Marketing videos and their essence
Submitting marketing videos to popular directories like YouTube is an emerging technique worth considering. You can make short videos highlighting the benefits of your products and use them on the directories to create back links. For the best results, emphasize content quality and preferably hire an adept animation professional capable of making attractive ads. Videos are more convincing to prospective customers and are best used where you want to increase the traffic flow to your website.
Maximizing profits with link building
You will realize a considerable increase in traffic once you employ the link building tips discussed above. However, it will be of no use if prospective customers visiting your website fail to make transactions. If your site does not possess the qualities you paint in your articles and other content, a prospective customer will navigate away and perhaps leave negative feedback. For this reason, you should first improve the appearance and functionality of your website before embarking on SEO techniques like link building.
Sometimes a marketer might find it hard to handle the task individually. At this point, one can consider hiring a link building firm for efficiency and convenience.

10 Of The Best Infographics On SEO

An infographic is a graphic that displays statistics, data or copy. Infographics provide a significant amount of information in a concise format. Because an infographic is visually appealing, it captures the attention of an audience faster than plain copy or prose. The medium is evolving to include a variety of different and informative communication tools.
Google Panda
This infographic provides readers with information about low-ranking websites, based on user feedback about those sites. It primarily depicts the evolution of the Google algorithm used to evaluate website quality.
SEO versus PPC
DIY SEO depicts the importance of a company’s presence in the top two or three positions on the search engine results page (SERP). The infographic also describes the number of people who click on a PPC ad. User search behavior is also presented in the graphic.
SEO versus PPC 2
Information about why companies need SEO services is included in this infographic. Guidance is the primary purpose of the infographic. The infographic informs companies about the number of people who make online purchases and who use search engines around the world. The likelihood of a user clicking on the first three search engine results is also explored.
Learn How Google Works
This infographic uses a diagram to describe how the Google algorithm ranks websites. Experts cite this attribute as important for understanding SEO.
Understanding Google Page Rank
Understanding Google page rank is an essential component of the SEO process. Information about the ranking tiers is provided with vibrant and memorable graphics.
Visual Guide to SEO
This visual representation indicates how Google ranks pages in a hierarchical format. Page optimization and link building are two primary topics of these websites.
Blog Design for Killer SEO
Websites with RSS feeds and social network accounts with a significant amount of activity are valuable for companies desiring SEO services. This concept is explained.
SEO FAQ
This infographic is more of a guide than a FAQ. The informative visual representation will answer common questions about link building, keyword research, page optimization and site architecture and structure.
ROI of SEO
The return on investment based on the number of sales resulting directly from clicks is depicted in this infographic. Tactical ROI is also explained in a concise format for readers. The infographic describes the concept in an easy-to-understand format.
SEO Factors
Optimal rankings are achieved by applying certain techniques. These techniques are listed in a concise format to inform readers of the important components of SEO ranking. Factors such as keyword density, search engine friendly URLs, blogs and RSS feeds are mentioned to help business owners develop their website properly.
Infographics are ideal for communicating important data in a concise, efficient manner. This communication tool is recommended for meetings or for busy professionals pressed for time. Consider infographics for website communication or business meetings.

Paid Customer Reviews on Google – No, You Don’t Want Them

For every local business, a Google Plus Local listing is the first step in optimizing their website for local search; however, just listing your business isn’t enough anymore, because your competitors have also learnt that they should do it. The name of the game now is optimizing your listing so it ranks higher than theirs.
That’s where the reviews come in. All of your clients and customers have the opportunity to write about the experience they had with the product or service you provided, and that carries a lot of weight in the eyes of Google: more customer reviews mean that your business is popular in the “real” world. Besides, good reviews on Google+ (and anywhere else on the Internet) can mean the difference between a customer choosing you or the competitor. That’s why there are many services today that provide fake reviews, either positive reviews for your business or negative ones for a competitor’s, and many people are willing to pay for this kind of service, not knowing that it may cause a serious problem for their business.
What’s the problem with paid reviews?
The obvious problem is that it’s against Google’s policy. The whole purpose of leaving a review is to help others decide if your business is right for them, and Google are working very hard to prevent any kind of manipulation of their results, including reviews. They can’t afford to recommend local businesses that provide bad service, because if they did, people would simply stop using reviews as a relevant factor. So, if you get caught manipulating reviews, your page on Google+ Local will be suspended and your business banned, and all the other hard work you’ve put into your online reputation can easily be ruined.
Phony reviews from paid services are actually really easy to detect, even when you don’t have Google’s infrastructure. Companies that provide this kind of service usually have a certain number of fake accounts and use the same “users” to review business pages; anyone can see the profile of each reviewer, so if something smells funny you can just take a look at a few user profiles and see if they’re “professional reviewers”: what are the chances that a real person fixes a mobile in one state, has a dentist in another, a plumber in a third and a carpet cleaner in a fourth? And that they’re either extremely satisfied or extremely unhappy with the service (positive reviews for the client’s website, negative for the competitor’s)? If you’re willing to dig deeper, you’ll probably find that the same fake users are reviewing the same business pages, with the same opinions, of course.
Another form of paid review is when a business offers an incentive, in the form of money or products, in exchange for positive reviews. This can be done in a variety of ways and is generally more difficult to detect (because it’s often done offline); it is also against Google’s terms, even though in this case real people are giving the feedback. On the other hand, you can encourage your customers to review your business, but there’s a very thin line between encouraging them through, say, contests, and bribing them.
However, this can be a powerful strategy for getting customers to leave a review; just be very, very careful when defining your approach: you are allowed to encourage them to leave a review, but not a positive review specifically. You can end up having some not-so-great reviews on your business page, but that’s OK too, because:
a) right now Google seem to be looking only at the number of reviews and not their quality when it comes to deciding on how to rank your page;
b) having all great reviews isn’t natural: no matter how hard you try to satisfy all of your customers, there’s always someone who didn’t like one aspect of doing business with you, so having the occasional bad or average review can actually give you more credibility with potential customers.
We haven’t even started on the ethical questions around paid and phony reviews, and how faking them makes reviews in general less credible. All in all, it’s always better to do things the right way. It’s good that you’re aware of the importance of having your business reviewed on Google, but you surely have satisfied clients you can ask for legitimate feedback, so start with them and work from there. It’s much more worthwhile in the long run.
As a Google+ early adopter, Jeff Gross got into reviewing local businesses as soon as it rolled out. He has seen his share of both genuine and the so-called “fake” or paid reviews from non-customers. While content crowdsourcing, such as encouraged commenting on www.Serijskiubojica.hr, is a legitimate method of improving your website, artificial reviews are both ignored and dangerous.

Free eBook Download: “Make Money Blogging” by Daniel Scocco

Blogging is a popular way to share your knowledge with others and generate residual income. If you are planning to start blogging, you need to know both the basic and advanced processes of blogging. You can gather blogging resources from other existing blogs or from hard-copy eBooks, but you won’t get every resource in one place; you have to search for the best tools and resources. To solve this little problem, I want to give you a free eBook that covers everything about making money blogging. I didn’t write it. The eBook was written by Daniel Scocco (www.Dailyblogtips.com), a successful blogger and internet marketer. The full details about the eBook are given below:
Screen shot of the book:
Make Money Blogging
Contents of the Book:
  • What is blogging?
  • Tools needed to start blogging
  • What software should you use?
  • Is it really possible to make money blogging?
  • What should you blog about?
  • How to write a good headline
  • Posting frequency
  • How to design a blog
  • The importance of a logo
  • Subscription tutorial
  • Networking tutorial
  • How to promote a blog
  • How to monetize a blog, etc.
Author: Daniel Scocco
Website: www.dailyblogtips.com
Size: 1.31 MB
Page: 54
Chapters: 5
Price: worth $47, now 100% free
Download:
Link 1
Link 2

How to Create a Classified Ads Website and Make Money With It?

From a blogger’s point of view, when the words “website” or “blog” come to mind, we think of the necessary content, huge visitor numbers, and more money. A website can be a good source of income; some people earn thousands of dollars online from a single website. Owning a money-making website is a dream for every webmaster.
Today I am going to give you some tips for creating a classified ads website and making money with it. A classified ads site is a site on which various types of ads are posted under different categories. Classified ads sites have proven to be an effective way to gain traffic and more affiliate leads. The best-known classified ads website is craigslist.org.
Craigslist is well established and the first-choice classified site for most people, but you can create a competitor to it. The process of building a classified site is very easy.
Tools needed to start:
How to get started?
With the resources above properly installed, your site will be ready for posting ads. Post some test ads on your site, then promote it on the internet to get traffic. Once visitors come and post on your site, it will be ready to work as a money-making machine.
How to make money with it?
Your site is ready, visitors post ads on it regularly, and your domain name is popular. Now you want to make money from it, but you don’t know where the money will come from. There are various ways to make money from a classified ads site:
  • Paid membership.
  • Paid ads/feature ads.
  • Banner advertising.
  • Affiliate Marketing.
Benefits to members:
Your members are mainly buyers and sellers. Through your classified site, buyers find the products or services they need, and sellers find potential buyers for their products. In short, your site brings buyers and sellers together in one place.
Bonus tip:
If you plan to run your classified site on its own, pick a mini or micro niche. Niche businesses are very profitable, as I hope you already know, and a niche site is much easier to operate than a multi-category site. If you have enough manpower and resources, then start a multi-category classified site like Craigslist.
I hope this post helps you start a new classified site with only a little technical knowledge. If you need any help building a classified site, don’t hesitate to contact me. I will try my best to help you.

SEO Tips!

First and foremost among SEO tips: you have to know the current rank of your website. Tools like the Google toolbar or Alexa can help you find your site’s ranking.
One of the best SEO tips is keyword analysis and placement. Always focus on the right keywords; select keywords according to your products or services. The better you line up your keywords, the higher the chance of getting targeted traffic. Place keywords in your content, code, images and URLs, but keep them within sensible keyword density limits (roughly 3-5%) and well spaced out (40-100 words apart is fine). Go beyond that and Google’s spiders may consider it spam, and readers will also get the sense that the content exists only for advertising purposes.
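As a rough illustration of that density guideline, here’s a small Python sketch that computes how often a keyword appears in a block of copy. The sample copy and keyword are made up.

    # Compute keyword density as (keyword occurrences / total words) * 100.
    # Illustrative sketch; the copy and keyword below are hypothetical.
    import re

    def keyword_density(text, keyword):
        words = re.findall(r"[a-z0-9']+", text.lower())
        hits = sum(1 for word in words if word == keyword.lower())
        return 100.0 * hits / len(words) if words else 0.0

    copy = ("Our garden tools are built to last. Each garden tool ships with a "
            "ten-year warranty, and every tool is tested by hand before it "
            "leaves the workshop.")
    print(f"density of 'tool': {keyword_density(copy, 'tool'):.1f}%")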
Keyword positioning is an essential part of these SEO tips; place keywords in the following elements:
Placing keywords in the title tag is very important. Titles should be short and concise; around 70 characters is enough for Google. Keywords should be placed in an organic way, and repetition is not recommended for web pages appearing on the SERP.
The meta description is another important part of on-page SEO. Google only considers approximately 150 characters of the meta description, so it’s imperative to write a unique and attractive meta description within 150 characters using your keywords.
Images make pages and articles striking to visitors and are a very important part of SEO. Moreover, images are also indexed by the search engines, and using relevant keywords in image alt tags helps in the SERPs.
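To illustrate those three on-page elements, here’s a small Python sketch that checks a page’s title length, meta description length, and images missing alt text. It’s a rough check, not a full SEO tool, and the sample HTML is made up.

    # Parse a page and report title length, meta description length, and
    # images missing alt text. Illustrative sketch with hypothetical HTML.
    from html.parser import HTMLParser

    class OnPageChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.meta_description = ""
            self.images_missing_alt = 0

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content", "")
            elif tag == "img" and not attrs.get("alt"):
                self.images_missing_alt += 1

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    html = """<html><head><title>Hand-Made Garden Tools | Example Shop</title>
    <meta name="description" content="Hand-made garden tools with a ten-year warranty.">
    </head><body><img src="spade.jpg"><img src="rake.jpg" alt="steel rake"></body></html>"""

    checker = OnPageChecker()
    checker.feed(html)
    print(f"title: {len(checker.title)} chars (aim for about 70 or fewer)")
    print(f"meta description: {len(checker.meta_description)} chars (aim for about 150 or fewer)")
    print(f"images missing alt text: {checker.images_missing_alt}")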
Another powerful and worthwhile part of SEO is the relationship between back links and search engine optimization. The more quality back links you have, the more popular you are with the search engines, which is why you cannot ignore this part of SEO. The most popular ways of acquiring back links are blog and forum posting related to the business, linking on social media networks such as Facebook and Twitter, and article and web directory submissions.
A sitemap is another very important tool: build a good sitemap for your site so that it takes only a few clicks to reach any particular web page and search engine crawlers can easily explore your site. Also try to use keywords in URLs to make your web pages SEO friendly.
As you know, the world has become a global village, which is why every webmaster and business person has to be proactive to get the maximum benefit from this vast market for their products and services. To do that they have to be aware of SEO tips and learn how to apply them. SEO is now as important an asset for a business as capital, and it is one easy way to get the most out of this booming virtual world. Seo tips4all not only provides in-depth knowledge of these SEO tips but also keeps you updated as they change.

Monday, June 4, 2012

Search engine optimization

Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines' "natural," or un-paid ("organic" or "algorithmic"), search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[2] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
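For illustration, here is a minimal Python sketch of that crawl-then-index loop: download one page, pull out its links for later crawling, and record where each word appears. It is a toy, with a single page, no scheduler and no politeness rules, and the start URL is hypothetical.

    # Toy crawl-and-index pass over one page: extract links for later
    # crawling, then map each word to its positions on the page.
    import re
    import urllib.request
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    url = "https://example.com/"  # hypothetical start page
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    extractor = LinkExtractor()
    extractor.feed(html)

    # Crude "indexer": strip tags, then record word positions.
    text = re.sub(r"<[^>]+>", " ", html)
    index = {}
    for position, word in enumerate(re.findall(r"[a-z0-9]+", text.lower())):
        index.setdefault(word, []).append(position)

    print(f"{len(extractor.links)} links queued for later crawling")
    print(f"{len(index)} distinct words indexed from {url}")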
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[3] The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997.[4]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[5][unreliable source?] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[6]
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, allowing those results to be false would turn users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[7] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[8] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[9]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[10] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[11][12] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[13]
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[15]
In 2007, Google announced a campaign against paid links that transfer PageRank.[16] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[17] As a result of this change, the use of nofollow leads to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript, thus permitting PageRank sculpting. Additional solutions have been suggested, including the use of iframes, Flash and JavaScript.[18]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[19]
Google Instant, real-time-search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[20]
In February 2011, Google announced the "Panda" update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system which punishes sites whose content is not unique.[21]

Relationship with search engines

Yahoo and Google offices
By 1997, search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[22]
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[23] was created to discuss and minimize the damaging effects of aggressive web content providers.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[24] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[25] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[26]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. Major search engines provide information and guidelines to help with site optimization.[27][28] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[29] Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing users to determine the crawl rate and how many pages have been indexed by their search engine.

Methods

Suppose each circle is a website, and an arrow is a link from one website to another, such that a user can click on a link within, say, website F to go to website B, but not vice versa. Search engines begin by assuming that each website has an equal chance of being chosen by a user. Next, crawlers examine which websites link to which other websites and guess that websites with more incoming links contain valuable information that users want.
Search engines use complex mathematical algorithms to guess which websites a user seeks, based in part on an examination of how websites link to each other. Since website B is the recipient of numerous inbound links, B ranks highly in a web search and will come up early in the results. Further, since B is popular and has an outbound link to C, C ranks highly too.
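For illustration, here is a small Python sketch of the random-surfer scoring idea described above, a simplified PageRank-style power iteration rather than any engine's actual algorithm. The link graph is made up, but mirrors the description: several sites link to B, and B links on to C.

    # Simplified PageRank-style power iteration over a made-up link graph.
    DAMPING = 0.85
    GRAPH = {            # site -> sites it links to
        "A": ["B"],
        "B": ["C"],
        "C": ["A"],
        "D": ["B"],
        "E": ["B", "D"],
        "F": ["B", "E"],
    }

    sites = list(GRAPH)
    rank = {s: 1.0 / len(sites) for s in sites}   # every site starts out equal

    for _ in range(50):
        new_rank = {s: (1.0 - DAMPING) / len(sites) for s in sites}
        for site, outlinks in GRAPH.items():
            share = DAMPING * rank[site] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank

    for site, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{site}: {score:.3f}")

Running this ranks B first and C second, matching the intuition that a page with many inbound links, and the pages it links to, score well.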

Getting indexed

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[30] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[31] Two major directories, the Yahoo Directory and the Open Directory Project, both require manual submission and human editorial review.[32] Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.[33]
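For illustration, a minimal Python sketch of generating such an XML Sitemap file; the page URLs are hypothetical.

    # Build a minimal sitemap.xml listing a few page URLs (hypothetical).
    import xml.etree.ElementTree as ET

    NAMESPACE = "http://www.sitemaps.org/schemas/sitemap/0.9"
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/product/12345/",
    ]

    urlset = ET.Element("urlset", xmlns=NAMESPACE)
    for page in PAGES:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
    print("wrote sitemap.xml with", len(PAGES), "URLs")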
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[34]

Preventing crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[35]
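For illustration, here is a small Python sketch showing how a compliant crawler interprets a robots.txt file. The file contents and URLs are hypothetical; per-page exclusion would use the robots meta tag instead.

    # Parse a minimal robots.txt (hypothetical) and check which URLs a
    # compliant crawler would be allowed to fetch.
    from urllib import robotparser

    ROBOTS_TXT = """
    User-agent: *
    Disallow: /search
    Disallow: /cart/
    """

    rp = robotparser.RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    for url in ("https://example.com/articles/seo-basics",
                "https://example.com/search?q=panda",
                "https://example.com/cart/checkout"):
        print(url, "->", "crawlable" if rp.can_fetch("*", url) else "blocked")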

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to the most important pages may improve its visibility.[36] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[36] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the rel="canonical" link element[37] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
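To illustrate URL normalization, the first snippet below is a rel="canonical" link element placed in the head of the duplicate pages, and the second is one common way to implement a 301 redirect with Apache's mod_rewrite; example.com and the product URL are placeholders for this sketch.

<link rel="canonical" href="http://www.example.com/product-page/">

# .htaccess (Apache): 301-redirect the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Either approach consolidates the link popularity of the duplicate URLs onto a single preferred version.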

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[38] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[39]
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[27][28][40] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[41] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
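Purely as an illustration of what search engines penalize, and not something to deploy, hidden text is usually implemented with markup along these lines:

<!-- text colored to match the background -->
<p style="color:#ffffff; background-color:#ffffff;">stuffed keywords here</p>

<!-- text pushed off screen or placed in an invisible div -->
<div style="position:absolute; left:-9999px;">more stuffed keywords</div>
<div style="display:none;">still more stuffed keywords</div>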
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[42] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[43]

As a marketing strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, depending on the site operator's goals.[44] A successful Internet marketing campaign may also depend upon building high quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[45]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[46] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes - almost 1.5 per day.[47] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[48] Seomoz.org has suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.[49]

International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[50] In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007.[51] As of 2006, Google had an 85-90% market share in Germany.[52] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[52] As of June 2008, Google's market share in the UK was close to 90%, according to Hitwise.[53] Google holds a similarly dominant share in a number of other countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[52]

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[54][55]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[56][57]

Sunday, June 3, 2012

Comics Make Great Link Bait

One of our clients, Orthopaedic & Spine Center, posted a comic we created for them on their blog. Besides being funny and a nice five-second distraction for your day, the comic also gives outside sites a reason to link to them. In other words, this comic, and comics in general, make great link bait. You may want to try pairing comedy and SEO for your own clients, if you haven't already. We would love to see other examples of comics used for link building, whether your own or ones you've found in the wild; add any good finds to the comments below.
We did something similar when we created the first ever monthly SEO Comic a few years ago.
Here is the Bad to the Bone comic from Orthopaedic & Spine Center. (Love the comic name.)
(Image: the Bad to the Bone comic)