Posted By Shabir MS on: August 2nd, 2015
So you’ve been spending an incredible amount of time and effort on link-building and you are still seeing no improvement in your rankings. You begin to think it is an exercise in futility, and contemplate killing it altogether. Our advice is to pause, take a deep breath, and take a giant step backward to look at the big picture. In other words, do a thorough website audit to examine link quality.
Getting an outside SEO perspective may change the way you see your link building strategies. Links are still vital to SEO, despite your seemingly fruitless acquisition efforts. It might be that your process needs a revamp, or a complete overhaul with the help of an expert team. The following are likely reasons why your link building efforts are falling short:
- You are nurturing all the wrong links. Wrong links can mean many different things: links from the same places you usually get them, links from less trustworthy and less authoritative sources, links that look potentially manipulative, and links from irrelevant pages. These types of links don’t necessarily add value to your optimization and may not be enough to convince Google to move you up the rankings. Link location and uniqueness also play a role in determining the relevance and value of a particular editorial vote. The remedy? Push for in-content links, and acquire links from relevant, trustworthy, and authoritative sites that are distinct from your usual sources.
- Wrong domain. This is probably one of the most frustrating link building issues. People’s perception of your domain influences their behavior and attitude toward your content in search results. This is why it is important to build brand affinity with the search topics and keywords that are on your searchers’ minds.
- Accessibility issues. Technical glitches can also be real link busters. Auditing your site for dead links or pages that return errors is important if you want to enhance your link building campaign.
Posted By Shabir MS on: July 30th, 2015
The core principles of building and maintaining links remain intact, but over the years, the signals that determine the value of a link have changed significantly. The most recent of these major changes rolled out in the second quarter of 2012, when Google introduced its Penguin update. Since this algorithm change, a few significant shifts have occurred, ushering in the age of user data-driven results and machine learning.
The old mindset that the more links your website has, the higher it will rank is no longer valid. Webmasters in the past may have seen success in acquiring as many links as they could, regardless of quality, for the sake of a short-lived spike in rankings. Today, however, this short-term approach simply cannot fly with the new standards. Business owners and webmasters, along with SEO specialists, now know that it takes much more than link quantity to build link strength.
Good content marketing is the way to good quality links, which is why developing content assets that your target audience cares about should be a major priority. This will encourage them to share your content and link to it from somewhere credible, authoritative, and relevant. Generating link-worthy content that visitors care enough about to share organically, and that other bloggers and websites deem valuable, is a more efficient way to build and acquire quality links than any other tactic.
Today’s link building strategies must be grounded in major goals: building an audience large enough to drive traffic to brand-new content, and forging strong relationships with credible, trustworthy, and relevant websites. Google and other search engines are condemning scalable, artificial link-building. If you are still stuck in your old ways, you are bound to lose not only the quality and value of your links, but also your rankings, and eventually, your traffic.
Posted By Shabir MS on: July 27th, 2015
Will your gateway pages set you up for SEO ruin? Most likely, yes. With ranking adjustment algorithms specifically designed to address doorway pages on the way, it is high time to follow Google’s advice on dealing with multiple sites and pages that lead users to the same destination.
To give you a working definition, doorways are pages or sites created specifically to rank highly for particular searches. What’s wrong with them is that they clutter search results with multiple similar pages that all take users to the same destination; worse yet, they lead to intermediate pages that may not be as useful to the searcher as the final destination.
Examples of doorway pages include multiple domain names or pages targeted at specific regions or cities that then funnel users to a single page; pages generated solely to funnel visitors into the actual usable or relevant portion of a website; and substantially similar pages that are closer to search results than to a clearly browseable hierarchy.
Since there is still no broad-impact evidence indicating that the doorway update has been released, you may still have time to mitigate or completely avoid its effects by:
- Taking down your empty pages and refraining from pushing new pages or making them live until unique content is actually available;
- Cleaning up site navigation so human visitors can more easily find their way around your website. This means removing pages that are designed solely to drive traffic but are only loosely integrated into your navigation;
- Avoiding duplicate content;
- Holding off on deploying pages that rely on staff- or customer-generated content until there is enough legitimate activity to justify them; and
- Avoiding multiple sites if possible, or ensuring that separate domains can be justified in that they offer dramatically different products or services. Similar links pointing to the same contact form from different domains are probably risky.
Posted By Shabir MS on: July 24th, 2015
Overturning an unnatural links penalty can be difficult. There’s an odd mixture of fear and excitement that comes with waiting for Google’s response after filing your first reconsideration request. "Manual Spam Action Revoked" is definitely the answer to root for, but many are doomed to the dreadful "Links to your site violate Google’s quality guidelines," which only means more work and ground to cover in order to lift the penalty.
Unfortunately for those who can’t seem to get out of the link-spam rut, the search engine giant doesn’t give further direction or explanation to help you understand why the request for reconsideration was denied. You are lucky if you are given examples of unnatural links to take care of, but this is not always the case. Sometimes you will receive no explanation at all as to why your request failed. Based on best practices and standards offered by Google, here are likely reasons why your reconsideration request didn’t come through for your website:
- You did not address enough unnatural links. A careful audit of your site is a must to make sure that you remove as many unnatural, unnecessary, and substandard links from your profile as possible. It’s not enough to go only after the worst links. What Google wants to see is you identifying and recognizing close to 100% of your unnatural links.
- A non-compelling reconsideration request. When writing your request, don’t run too long, but explain enough to convince the company’s web spam team that you’ve been working hard to remove as many unnatural links as you can, and will continue to work on the quality of links to your site moving forward.
- Wrong disavow file format. In the past, there was no telling whether the disavow file you submitted would work or fail due to a syntax error. Today, however, an error message alerts you so you can fix problems before resubmitting.
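For reference, the disavow file Google accepts is a plain .txt file with one entry per line: a `#` starts a comment, a `domain:` prefix disavows an entire domain, and a bare URL disavows a single page. A minimal sketch (the domains shown are hypothetical):

```text
# Outreach to remove these links failed, so we are disavowing them.
# Disavow every link from an entire domain:
domain:spam-directory.example
# Disavow individual pages only:
http://low-quality-blog.example/paid-links/page1.html
http://low-quality-blog.example/paid-links/page2.html
```

Mixing other text onto an entry line, or listing something that is neither a URL nor a `domain:` entry, is the kind of syntax error the new error messages catch.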
Posted By Shabir MS on: July 21st, 2015
The importance of on-topic and off-topic links has always been a big issue among website owners and marketers—all of whom want to know whether it matters that they have more of one or the other. By on-topic links, people usually mean those coming from pages and sites that discuss the same, or at least a very similar, subject matter as their own. With all the talk of link quality being more important than link quantity, it makes sense that people would think Google cares to some degree about how on-topic or off-topic a link is.
SEOs will argue passionately for one or the other based on their different experiences. Some testify that on-topic links are significantly more important in SEO than off-topic ones, while others see more positive results from the latter. What this anecdotal evidence tells us is that however Google sees it, and whether or not it has some idea of context (it is said that Google looks at the relationship between the content of linked pages), both on-topic and off-topic links have the power to influence searches to some extent.
Link context is an important criterion to look at if you are engaging in link-building strategies that could potentially be seen or tagged as manipulative. When in doubt about the appropriateness of putting a link out there, you can always go back to your on-topic standards. However, before obsessing over which type of link is more valuable, ask yourself whether it is even realistic for you to pursue and get those links. It all boils down to what search engines care about most, which is matching content relevance to the search query while measuring link popularity. They also care about brand and topical authority as well as domain authority, along with other things like user data, engagement, and anchor text—all of which form part of a bigger puzzle than link context alone.
Posted By Shabir MS on: July 18th, 2015
E-commerce sites are the lifeblood of many businesses, particularly those that operate solely online. Anyone planning to sell products or expand their services through the internet should know that a website is of little use if people don’t know about it. This is why one major goal of online marketing should be making sure that your target audiences can find you via the web.
Search engine optimization for e-commerce sites does not stray far from traditional SEO efforts. Here are some tactics you can adopt and follow to help popularize your site and create a stronger web presence for your business:
- Invest in effective keywords.
Search engines rely heavily on keywords to rank relevance. Finding the right keywords to build your campaign on is critical to the success of your optimization efforts. Strategically placing well-researched keywords and phrases will improve the effectiveness of your website in attracting the right people to your doorstep. Use keywords that are specific to your niche, and take advantage of long-tail keywords to catch specific types of traffic that lead to higher conversion rates.
- Ditch duplicate content.
E-commerce sites are prone to duplicate content because they often carry products that are essentially the same. Be careful when formulating product descriptions so as to avoid redundancy. Make sure that all content is original and not similar-sounding, because duplicate content can send your rankings down. Just the same, don’t forget to write catchy descriptions that will make already-captured audiences want to take action.
- Optimize product images.
In addition to using good quality photographs, make it a habit to optimize images by making them easier for search bots to see and understand. This can be done by adding proper, unique ALT tags associated with the products pictured.
- Fix broken links.
One common problem with e-commerce sites is an abundance of broken links. These glitches are sure traffic killers because they make consumers lose interest. For this reason, you should always keep tabs on all your links so you avoid sending visitors into limbo. There are many different tools you can use to find and fix broken links.
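As a rough illustration of what such link-checking tools do, here is a minimal Python sketch. The HTML parsing and the status checker are deliberate simplifications (a real tool would parse HTML properly and issue HTTP requests), and the function and page names are hypothetical:

```python
import re

def find_broken_links(html, is_alive):
    """Extract href targets from a page and return those the checker reports dead.

    `is_alive` is any callable that takes a URL and returns True if the URL
    responds successfully -- in practice this would make an HTTP request.
    """
    hrefs = re.findall(r'href="([^"]+)"', html)
    return [href for href in hrefs if not is_alive(href)]

# Usage with a stub checker standing in for real HTTP requests:
page = '<a href="/products/widget">Widget</a> <a href="/old-sale">Sale</a>'
dead = find_broken_links(page, lambda url: url != "/old-sale")
print(dead)  # ['/old-sale']
```

Running a check like this regularly against your product pages is the habit that keeps visitors out of limbo.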
Posted By Shabir MS on: July 15th, 2015
Change is the nature of SEO, and the constant evolution of techniques and best practices is something marketers must learn to cope with on a regular basis. As search engines like Google strive to improve the way websites are ranked in searches, website owners must continue to adjust their strategies to comply with best practices. Over the years, Google alone has made at least 13 updates to its algorithm, some more notable and public than others. Google’s Penguin update has already taken its toll on over-optimized websites, but what does the Panda update imply? Keeping up to date with these algorithm changes is crucial for any website, especially one that relies heavily on organic traffic for new business. The same rings true when it comes to optimizing websites for other major search engines. Here are notable changes to watch out for in SEO:
- Mobile optimization has been practiced since the advent of web-enabled mobile devices, but more focus should be put into creating and optimizing website content for mobile traffic, especially with valid projections that mobile traffic will exceed desktop traffic by 2015. Google has always been firm in saying that responsive websites offer the best user experience—something it favors when ranking searches. Recently, Google has started adding "mobile-friendly" notations next to sites that are truly mobile friendly in mobile search results.
- While Google rules the search engine market, 2015 is projected to be the year that other engines step up and begin to take more of their fair share. With more browsers adopting other engines as default search tools, it wouldn’t be surprising if users start to divide and disperse. With search options other than Google becoming more accepted and used, it is important for websites to gain visibility across these engines.
Posted By Shabir MS on: July 12th, 2015
User-generated content is often neglected as an SEO strategy simply because not many recognize its value. With so many factors affecting search rankings, website owners are easily overwhelmed with the many things they need to optimize, giving them a much harder time determining the best website elements to leverage.
If you follow algorithm updates religiously, it is easier to determine which techniques to focus on and which strategies to abandon or use more sparingly in order to help your pages rank better. One of the most recent algorithm updates introduced by Google, for instance, focuses more on conversational searches, encouraging website owners to produce more original content through genuine authorship, social channels, and user-generated content. All of these help identify users or websites as authentic resources for credible information.
Because SEO is all about relevancy, generating relevance and authenticity in the right places matters, and this should be at the forefront of your optimization strategy if you want to stay ahead of the game. User-generated content is the newest way to optimize websites for searches. Encouraging consumers to speak more about your company or brand offers a two-fold advantage, as you increase brand awareness and enhance your rankings in search results. Your new SEO strategies should therefore focus on improving your customer service and making your business more social.
Just the same, users are not SEO experts, which means it is your job to educate them on optimization. This doesn’t necessarily mean begging them to use your keywords in their reviews and comments, but rather making a habit of using those keywords in your own copy so that consumers are encouraged to use them organically. There are many creative ways to get users to generate relevant, keyword-rich content for your business, from creating unique keyword hashtags for sharing brand experiences, to running contests, to promoting user-generated content in your social ads.
Posted By Shabir MS on: July 9th, 2015
More links = more search traffic = more customers: this is one of the simplest and most important equations to remember when doing SEO. Links have a direct correlation with how much traffic a website receives, as well as the quality of those visits. In fact, when Google first started ranking websites in searches, its algorithm gave much weight to the number of links pages had. SEO therefore operated on the basic concept that the more links a webpage has, the more traffic it can generate. At the time, the number of links a page had was indicative of the quality of content it offered. Links served as popularity votes, and the more your website got, the more popular you became with Google.
Today, however, much has changed in the way search engines, especially Google, view links. Because links can now be easily (and artificially) generated and manipulated, their value has greatly diminished. In fact, sites can be penalized for having too many unnatural, bad, dead, or low-quality links in their profiles. The old equation of more links = more search traffic = more customers should therefore be modified to fit today’s standards, where certain types of links hold greater value and others serve as bad seeds that can pull rankings down.
A new equation, quality (organic, relevant) links – bad links = more relevant traffic = more converting customers, should therefore be used instead of treating all links as equal. Depending on what your website offers, some links are more valuable than others, as they help build relevance, trust, authority, and diversity. While links have lost some of their value over several years of search algorithm updates, they remain an important factor in determining website relevance, credibility, and authority in searches.
Posted By Shabir MS on: July 7th, 2015
E-commerce sites most commonly use geo-targeting, an optimization technique that helps them serve different content to audiences in different countries. In an effort to reduce the challenges these sites face in organic search, Google continues to improve the way it crawls geo-targeted sites. These changes enable more multinational sites to rank better in search results.
Geo-targeting is an accepted website feature and strategy that works to improve customer experience. Unfortunately, some of the techniques used to implement it can hinder search engines from crawling and indexing site content holistically. To resolve these crawling and indexing issues, Google created new, improved locale-aware crawling configurations for Googlebot, which are activated automatically when Google algorithmically detects language- or locale-based content. These configurations include language-dependent crawling, through which search bots can crawl pages with an "Accept-Language" HTTP header in the request, and geo-distributed crawling, through which bots use IP addresses that appear to come from outside the US in addition to those they currently use. These new configurations are deployed automatically as Google’s algorithm detects the need for more thorough indexing.
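To make the Accept-Language mechanism concrete: a geo-targeted server chooses which locale’s content to return by reading that header, which lists language tags with optional quality weights. Here is a simplified Python sketch of that parsing step (real servers also handle wildcards and malformed input; the function name is our own):

```python
def parse_accept_language(header):
    """Return language tags from an Accept-Language header, best first.

    Each entry may carry a quality weight such as `en;q=0.9`; entries
    without one default to q=1.0.
    """
    languages = []
    for item in header.split(","):
        item = item.strip()
        if not item:
            continue
        lang, _, params = item.partition(";")
        quality = 1.0
        if params.strip().startswith("q="):
            try:
                quality = float(params.strip()[2:])
            except ValueError:
                pass  # keep the default weight for an unparsable q value
        languages.append((lang.strip(), quality))
    # Highest-weighted languages first; Python's sort is stable, so ties
    # keep the order the client sent.
    languages.sort(key=lambda pair: -pair[1])
    return [lang for lang, _ in languages]

print(parse_accept_language("en-US,en;q=0.9,fr;q=0.8"))
# ['en-US', 'en', 'fr']
```

A crawler sending `Accept-Language: fr` would thus receive the French variant from a server that selects content this way, which is exactly what language-dependent crawling makes visible to Google.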
These improvements in Google’s crawling configurations are designed to be deployed automatically, which means no notifications or changes are required on your part. However, they do not eliminate the need to optimize your site locally. While this is a step toward resolving geo-targeting issues, search engines still have a long way to go in fine-tuning their algorithms and configurations to support geo-targeting better. Google, for one, still strongly recommends creating separate URLs with rel-alternate-hreflang annotations for different locales.
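Those annotations are ordinary link elements in each page’s head, with every locale variant listing all the others plus an x-default fallback. A minimal sketch for a hypothetical two-locale site:

```html
<!-- On every variant of the page, list all locale versions: -->
<link rel="alternate" hreflang="en-us" href="http://example.com/en-us/" />
<link rel="alternate" hreflang="de" href="http://example.com/de/" />
<!-- Fallback for searchers who match none of the locales above: -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Because each locale lives at its own URL, Google can index every version directly rather than relying on locale-aware crawling to discover them.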
On the other hand, there is still no way to request that Google engage its locale-aware crawling configurations on your website, and no way to determine whether they are already being used on your site.