Google Webmaster Update: Blocking JavaScript & CSS Can Affect Indexing

Posted on: February 23rd, 2015


A recent update to Google’s webmaster guidelines has website administrators unblocking CSS and JavaScript for crawling. According to the Google Webmaster Central Blog, Google’s latest indexing system update helps it render pages much like a modern browser, with JavaScript and CSS active. Its advice is explicit: allowing Googlebot to access a site’s CSS, JavaScript, and image files provides optimal indexing and rendering. Conversely, disallowing crawling of CSS or JavaScript files can harm how well the new algorithms index and render content, which can in turn result in suboptimal placements and rankings.
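In practice, the blocking usually happens in robots.txt. Below is a minimal sketch of the kind of rules to remove or add; the directory names and file patterns are hypothetical examples:

    # robots.txt — a minimal sketch; paths and patterns are hypothetical.
    User-agent: Googlebot
    # Explicitly allow stylesheet and script files:
    Allow: /*.css$
    Allow: /*.js$

    # Rules like the following hide rendering resources from Googlebot
    # and should be removed:
    # Disallow: /css/
    # Disallow: /scripts/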

The upgraded system will require procedural changes from webmasters and website administrators. According to Google, its indexing system should no longer be regarded as a text-only browser: it now renders pages the way a modern browser does. Webmasters should be advised that the rendering engine may not support every technology, so website designs must adhere to progressive enhancement principles, ensuring the engine can always reach usable, supported content. Google also reminds webmasters that page load speed matters both for indexing and for users, and that servers must be able to handle serving CSS and JavaScript files to Googlebot.

Furthermore, Google has also updated its Fetch diagnostic tool, which lets webmasters simulate how the search engine crawls URLs on a website. In the past, Fetch only retrieved the URL at the specific path requested; a successful fetch could then be reviewed to check for and debug connectivity and security issues. The new Fetch and Render mode goes further, telling Googlebot to crawl and display pages just as browsers would display them to your audience.

 

Link Removal Request Techniques that Actually Work

Posted on: February 18th, 2015

Bad links endanger your ranking in search engine results pages. One bad link can drag your placements down the ladder, leading to lost clients, reduced rankings, and low conversion rates. Anyone with some knowledge of SEO is aware of the importance of backlinks. However, as search engines strive to provide better service to their users, the way they evaluate links and update their results has shifted: certain types of links now carry less value, and some are flagged as irrelevant and toxic.

Link quality has turned out to be more important than ever, so many businesses have been focusing on removing bad links that may cause trouble with search engines and on creating high-quality link partnerships relevant to their niche. If your website has recently experienced a sudden decrease in rankings and visitor numbers, see if you can trace the problem to your backlinks. Toxic links can pull your search engine ranking down and make you invisible, and therefore inaccessible, to your target audience. If you have discovered harmful links during your SEO audit, request link removal from the offending partner. Here are effective ways to send removal requests:

  • Use a legitimate and reliable email address to send out your removal requests if you don’t want your initiative to be ignored. Use an email address on the right domain so your recipients immediately know who they are dealing with.
  • Find a reliable link disavow tool. Take Google’s advice and ask the search engine directly not to take low-quality links into account when it assesses your site (a sketch of the disavow file format follows this list). However, you should still make the effort to clean your link profile and rid it of unnatural links pointing to your site; disavowal alone might not be enough to regain your standing.
  • Send requests that state the specifics of what’s going on, but keep your explanation short and concise so recipients understand why you are making the request.
  • Don’t be threatening when you request link removal, as this can only make things worse for your site.
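For links you cannot get removed, Google accepts a plain-text disavow file uploaded through its disavow links tool. A minimal sketch of the documented format; the domains and URLs below are hypothetical:

    # disavow.txt — lines starting with "#" are comments.
    # Ignore every link from an entire domain:
    domain:spammy-directory.example

    # Ignore a single offending page:
    http://link-farm.example/widgets/page1.html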

 

How to Rank Well in Local Search

Posted on: February 16th, 2015

Constant change is to be expected in SEO standards, especially with the continuous improvements that search engine giants like Google make to their service. From algorithm changes to the growing importance of social signals, encrypted searches, and other SEO tweaks, search marketers and website administrators have a lot to keep up with.

While search engines do provide general guidelines to help websites maintain best practices, it is hard to figure out where to start pulling up rankings and getting the results you want, especially when shooting for good local SEO results. If local SEO is your main concern, your Google Places page is the best place to start. This Google property lists hundreds of thousands of local businesses across the globe, helping searchers and consumers make more informed purchase and subscription decisions. When consumers enter specific search terms, Google Places pages relevant to the query appear in the results. To make the most of local SEO for your website, there are ways to ensure your efforts are effective:

  • Verify your page – The first step to gaining control of your Google Places listing is to claim your business page. Once you have created your location page, verify the listing and make additional improvements.
  • Make efficient category associations – Once Google verifies your page, you can associate your business with search categories. Google Places offers as many as 10 categories (one primary and nine other relevant categories) under which your business can appear in searches.
  • Be consistent with your business information – Make sure your contact details (name, address, phone number, etc.) are consistent across every web page and link that contains them, to avoid discrepancies (a markup sketch follows this list).
  • Encourage people to review your business – The reviews you receive also play a part in your overall search optimization success.
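One way to keep that business information consistent and machine-readable is schema.org LocalBusiness markup. A minimal sketch in JSON-LD; every business detail below is a hypothetical placeholder:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "LocalBusiness",
      "name": "Acme Plumbing",
      "telephone": "+1-555-0123",
      "url": "http://www.example.com/",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701"
      }
    }
    </script>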

 

How to Optimize Multi Location Pages for Local SEO

Posted on: February 13th, 2015

Local optimization is challenging because the window for being found in a particular region on search engines is narrow, and the contest is fiercer when you are optimizing for a company with multiple locations. Local SEO for a multi-location business is more difficult than ordinary SEO: it requires a more organized way of planning around the company’s geographic priorities and business goals. The following are simple ways to improve local optimization for your multi-location business:

  • Do a location-based SEO audit – The best way to start rehashing your local SEO approach is to know where your campaign currently stands. A basic location-based SEO audit will show you which locations have the most room for improvement.
  • Determine geographical priorities – Ideally, you want to rank well for each of your locations, but if your resources and time are limited, it is important to prioritize your efforts. The best way to do this is to determine the demand for your products and services in each location. You can use Google’s Keyword Planner to gauge keyword competition and demand in your target locations.
  • Create a unique page for each location – One of the most important elements of local SEO success is having a unique page for each business location. If you have 10 locations in 10 different cities, each location must have its own page on the website, uniquely created and optimized to rank for the services it offers and the area it serves (a sketch follows this list). Ideally, each business location should also have its own location-specific social media account, background information, employee names or contact persons, and location-specific promotions.
  • Create a unique profile on Google Places for each of your locations – Complete separate profiles for each business location on Google Places for Business to increase your local rankings.
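A minimal sketch of what makes a location page unique; the URL pattern, business name, and copy are hypothetical examples:

    <!-- URL: http://www.example.com/locations/springfield/ -->
    <head>
      <title>Acme Plumbing in Springfield, IL | Drain Cleaning and Repairs</title>
      <meta name="description"
            content="Acme Plumbing's Springfield, IL branch offers drain
                     cleaning and emergency repairs. Call our Springfield
                     team at 555-0123.">
    </head>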

 

A Few SEO Considerations When Building a New Website

Posted on: February 11th, 2015

Search optimization should never be a secondary task. Whenever you build a new website or update an old one, keep search optimization in mind in order to meet new standards and rising consumer demands. Investing in SEO advice or a proper service from the get-go, or hiring a web developer versed in SEO techniques, will help ensure your website reaches its goals. The following are some of the most important things to consider when building a well-optimized website:

  • Use simple coding. With so many coding languages and standards, there are countless ways to create a website and achieve the desired result. However, this does not mean they are all equally good. When designing a layout for a website, be careful to avoid code bloat, which is a proven danger to your website. Trimming unnecessary code keeps pages friendly to end users and compliant with SEO standards.
  • Consider blogging. With the rave about content, it is definitely important for your website to generate new content whenever possible. An integrated blog not only helps you generate fresh content, but also helps build your authority and makes your brand more personable and approachable.
  • Consistency is the key. Be consistent in the way you style and code content and pages. Consistency across the website will help establish your brand and will make it easier to maintain the website. It also reduces the chances of things looking weird, misplaced or breaking the flow, especially when viewed across different platforms.
  • Guarantee speed. The speed at which your site can be browsed has a great impact. First, users are far more impressed by a site that loads quickly; speed can even let other shortcomings go unnoticed. Second, speed plays a dominant role in how the site ranks in search. There are ways to avoid stuffing your site with slow-loading elements: keep your code simple, choose a good host, minify your JS, CSS, and HTML files, and avoid heavy images and similar elements.
  • Keep the design responsive. A responsive design is crucial, especially for searches done on mobile devices, because responsiveness directly affects a website’s user experience, rankings, and traffic (a sketch follows this list).
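A minimal sketch of the responsive-design basics; the breakpoint value and class name are hypothetical:

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      .sidebar { float: right; width: 30%; }

      /* Stack the sidebar under the main content on narrow screens */
      @media (max-width: 600px) {
        .sidebar { float: none; width: 100%; }
      }
    </style>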

 

Dealing with Onsite Duplicate Content Issues

Posted on: February 9th, 2015

Having duplicate content may seem harmless, but it creates real obstacles for your search optimization efforts. The best way to illustrate why duplicate content is harmful is to point out the importance of unique content: unique website content is what sets your website apart and makes you different. When the same text appears on several of your pages, for instance word-for-word product descriptions presented at multiple URLs, you gain nothing. The same principle applies when you use the same information or text to describe what you do or what you sell as the next website.

Duplication occurs when more than one version of a certain page is indexed by a search engine, and it can be detrimental to your website’s rankings because it makes it harder for search engines to determine which page to use and rank. To remedy duplicate content within your site, there are certain things you can do:

  • Redirect them – You can redirect duplicate content by setting up a 301 redirect from the duplicated page to the original content (a sketch follows this list).
  • Canonical tag – The rel="canonical" tag dictates which version of a page you want indexed and shown in the search results.
  • Meta tags – Meta robots tags can also indicate to search engine bots which pages you do and do not want indexed.
  • Syndication – Syndicate content carefully to make sure it doesn’t cause duplicate content issues; ask your partner sites to use “nofollow” links to avoid duplication.
  • Responsive URLs – To solve duplication caused by separate mobile site versions, use a responsive design that serves one URL to all devices.
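A minimal sketch of the first three remedies, assuming an Apache server; every path and URL is a hypothetical example. The redirect lives in .htaccess:

    # .htaccess — permanently redirect the duplicate to the original:
    Redirect 301 /old-duplicate-page/ http://www.example.com/original-page/

The canonical and meta robots tags go in the <head> of the duplicate page:

    <!-- Point search engines at the preferred version: -->
    <link rel="canonical" href="http://www.example.com/original-page/">
    <!-- Or tell bots not to index this version at all: -->
    <meta name="robots" content="noindex, follow">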

 

How to Maintain Your Keyword Rankings after a Site Redesign

Posted on: January 23rd, 2015

One of the biggest risks of a website redesign is losing your SEO rankings. That is why it is very important to carefully plan each step of the process and consider the key elements that may affect your keyword rankings and search engine placement. Failing to take fundamental SEO principles into account as you plan your redesign can cause significant dips in traffic, which can in turn result in a great loss of business. A redesign should not be all about aesthetics, nor should it be solely about following trends. It should also be about keeping to best practices and maintaining the fundamental principles that retain good search optimization and facilitate growth.

When hiring a web consultant or an agency to help with your site redesign, make sure they have the right skill set and knowledge to preserve your web rankings and execute a redesign that takes care of both aesthetic and technical factors. Here are ways to retain SEO rankings after a site redesign or a new website launch:

  • Perform a thorough SEO audit. To minimize the risk of losing your audience and slipping down the search rankings, audit your old website and identify the elements you need to keep in place to retain your rankings, as well as the things you need to improve to beef the website up. Experienced agencies and web consultants have the tools to analyze the state of your website accurately and quickly and to carry the elements that work over to your new site.
  • Use 301 redirects. A costly mistake when migrating your site to a newer, better domain is forgetting to redirect old pages to new URLs. Changing page locations without redirecting them spells disaster for your search rankings because search engines will have no way of locating the pages. 301 redirects will help you preserve at least 90% of your link juice and ranking power (a sketch follows this list).
  • Create a well-designed site architecture and make sure your on-site elements are structured with SEO in mind, from your headings to your titles, tags, meta descriptions, and keywords.
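A minimal sketch of migration redirects, assuming an Apache server; the /blog/ to /articles/ rename and the individual paths are hypothetical examples:

    # .htaccess — map a whole section to its new home with one pattern:
    RedirectMatch 301 ^/blog/(.*)$ /articles/$1

    # One-off moves can be listed individually:
    Redirect 301 /about-us.html /about/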

 

Why SEO Still and Always Matters

Posted on: January 21st, 2015

Lots of debate goes on about whether SEO still has value in web marketing. With newer, seemingly better, and less tedious ways to gain popularity online, SEO is often viewed as old-fashioned and unnecessary. As a result, many people wonder whether SEO still matters and whether they should invest in it to make their site rank.

SEO is very much alive and well. In fact, it is a thriving industry, far from “dead” as many people think, and it is still, and will always be, an important aspect of web marketing. Search remains the most common starting point for anyone looking for a product or service on the web. Besides, the constant changes search engines make to their ranking algorithms show that SEO will always matter. Some tactics and strategies may become obsolete and even harmful to a website, but there are always new ways to cope with the changing scene. This is where you need to take a step back and evaluate what needs to be done to make your search optimization efforts more productive.

Hummingbird, Google’s latest algorithm update, is designed to make search faster and more precise. It aims to bring search closer to a conversational level, which means results pay more attention to the reason or intent behind the search instead of simply ranking sites on the words alone. It will not be long before searches can be done in a fully conversational manner, influenced by different social signals.

In search, quality matters, which is why SEO will always be significant, helping websites maintain content and design structure quality, which in turn helps pages climb up the ranking ladder and become more visible to searchers.

 

Optimizing a New Website for Long-Tail Keywords

Posted on: January 19th, 2015

Keyword research remains the basis of all search marketing, which is why it is so important as you build your search optimization strategy. Nowadays, search optimizers and web marketers are shifting their focus to long-tail keywords rather than the short keyword phrases that have long taken the optimization spotlight. Long-tail keywords are longer phrases that searchers are more likely to type into the search bar when looking for very specific products, services, or information, for example “handmade leather laptop bags” rather than simply “bags”. These keywords bring searchers closer to a point of purchase than more generic terms. Here are some useful insights to help you rank well for long-tail keywords.

  • Know your mission and purpose. You might want to sell something, provide information, or push a really great product or service. If you know what makes your content, product, service, or website special, it is easier to make readers like and even buy your stuff. Knowing your mission or purpose leads you to your niche and to what makes your product, service, or business unique. In turn, you gain leverage when you write these down and express them in words your audience understands and uses.
  • Target niches the big players overlook. Some markets are particularly hard to rank in because of the level of competition. Many small businesses suffer a great deal trying to dominate search results, especially when competing with big-budget companies that have everything it takes to launch full-on marketing campaigns. However, if your mission is clear and you can define what makes you stand out in the market, it becomes much easier to focus your efforts on what makes you great and to target long-tail keywords specific to your offering. This also leads to better-targeted audiences and prospective clients.

 

How to Easily Debug Markup Implementation Errors

Posted on: April 23rd, 2014

Debugging can sometimes be a tricky affair, but it has now been made easier. The Structured Data Dashboard in Google Webmaster Tools has a new feature that simplifies debugging and helps users learn how Google sees the marked-up data on a website.

In this feature, an ‘item’ stands for a top-level structured data element (excluding nested items) tagged in the HTML code. Items are grouped by data type and ordered by number of errors, with a separate scale for the errors themselves.

This makes it easier to compare errors against items: any change you make on the website can be traced through markup errors appearing or disappearing.

The process

•    Click on a specific content type to view its errors. The results can be filtered or viewed in one go.

•    Check whether the markup meets the implementation guidelines for each content type.

•    Click the URLs in the table provided to see the details of the markup detected the last time the page was crawled, along with any missing pieces.

•    Fix the issues and test the corrected markup in the Structured Data Testing Tool (a marked-up example follows this list). Once the pages are crawled and reprocessed, the changes will be reflected in the Structured Data Dashboard.
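For reference, a minimal sketch of the kind of markup the dashboard reports on and the testing tool validates, here schema.org Article in microdata; all content values are hypothetical:

    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">How We Fixed Our Markup Errors</h1>
      <span itemprop="author">Jane Doe</span>
      <time itemprop="datePublished" datetime="2014-04-23">April 23, 2014</time>
      <div itemprop="articleBody">…</div>
    </article>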


More error types will be added over time to improve debugging. Questions are inevitable with any new feature, so visit the Webmaster forums to learn more about the details of this feature, share your experience, and get feedback.