Google ranking

Study finds: outbound links affect the ranking of sites in Google

Reboot Online recently published the results of its research into how outbound links affect the position of websites in Google search results, despite statements to the contrary from Google’s representatives.

What are outbound links?

Outbound links are links that are meant to take you elsewhere – links that direct you to another specific webpage or to a different website altogether. There are two kinds of outbound links – dofollow links and nofollow links.

What did they do?

The study covered 10 newly purchased domains. All 10 domains were registered at the same time, to eliminate the influence of the domain age factor.

Several articles were published on these new sites, and two highly respected, relevant resources were mentioned in each of these articles. On 5 of the sites the resources were mentioned as plain text only, while on the other 5 they were included as hyperlinks, as follows:

1) aveoningon.co.uk – does not contain outbound links
2) bistuluded.co.uk – contains outbound links
3) chotoilame.co.uk – does not contain outbound links
4) dyeatimide.co.uk – contains outbound links
5) edikatstic.co.uk – does not contain outbound links
6) foppostler.co.uk – contains outbound links
7) gamorcesed.co.uk – does not contain outbound links
8) heabasumel.co.uk – contains outbound links
9) iramebleta.co.uk – does not contain outbound links
10) jundbaramn.co.uk – contains outbound links

All of these sites were designed with a similar, but not identical structure.

Web crawler access to the domains was blocked until the content of the sites had been published. This step was important to ensure that Google indexed the content of all the websites at the same time.

 

Once web crawler access had been unblocked and the sites were indexed, Reboot Online staff performed several searches for the key phrases and recorded the results as screenshots. The ranking progress was monitored for 5 months.

 

Ranking results

It was not a big surprise that the presence of outbound links to reputable sites (in the analysed websites) had a positive impact on site ranking.

Heat map

Heat maps were created based on the ranking results taken for one day of testing. Green squares indicate that the site held the expected position in the Google SERP for the keyword. Red squares (or a red gradient) denote that a site held a low position – the further away it is from the mid-point, the darker red the square is.

The statistics for the keywords [Phylandocic] and [Ancludixis] have been published:

Keyword [Phylandocic] results


 

Keyword [Ancludixis] results

Website ranking positions

Website ranking positions for the same keywords were also presented. These diagrams show the positions of the sites in the Google rankings. The blue line shows the Google ranking position of the sites containing outbound links. The orange line shows the Google ranking position of the sites that do not contain outbound links.

Keyword [Phylandocic] results


 

Keyword [Ancludixis] results

 

 


Top 10 Free Tools for Optimizing a Website

It’s easy to design and develop a website, but optimizing it for the search engines is tough work.

If you Google “free search engine optimization tools”, you will definitely find a large number of testing tools, so in this report we present our top 10 of the most useful, worthwhile and accurate optimization tools for webmasters.

1. Google Trends

Google Trends is a public web facility of Google Search that shows how often a particular search term is entered relative to the total search volume across various regions of the world, and in various languages.


Google Trends uses real-time search data to help you understand consumer search behavior.

Using Google Trends you can compare searches between two or more terms.

Google Trends helps you find out how people search for your brand, and gives you information about your competitors and partners.

 

2. Google AdWords Keyword Planner

Keyword Planner is a free AdWords tool that helps you build Search Network campaigns by finding keyword ideas and estimating how they may perform.


Keyword Planner helps you identify the keywords and the scope of your campaign. You can search for keyword ideas based on terms that describe your campaign (product, service…). You can create multiple lists of keywords and combine them. You can also get traffic estimates and historical statistics for the searches.

To use it you need a Google AdWords account. Using AdWords you can reach more customers online and boost your business.

 

3. Robots.txt Tester

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots.

The robots.txt Tester is a tool in Search Console for testing your robots.txt file. A robots.txt file sits in the root of your site and indicates those parts of the website you don’t want accessed by search engine crawlers.
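
As a simple illustration – a hypothetical robots.txt sketch with made-up paths and an example domain – the file placed in the site root might look like this, and the Tester lets you check whether a given URL would be blocked by these rules:

    # robots.txt – served from the site root, e.g. https://www.example.com/robots.txt
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Sitemap: https://www.example.com/sitemap.xml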

 

4. HTML/XHTML Validator

 This validator checks the markup validity of Web documents in HTML, XHTML, SMIL, MathML, etc.

W3C validator allows you to check HTML and XHTML documents for well-formed markup. W3C validation is important for site usability and browser compatibility.

 

5. CSS Validator

The W3C CSS Validation Service is free software that helps web designers and web developers check their Cascading Style Sheets (CSS).


The W3C CSS Validator allows you to find errors and issues that need to be fixed in your CSS file. Using it you can check your CSS code and find errors or typos.

 

6. Structured Data Testing Tool

It’s a Google tool that validates and tests structured data on websites.

You can use the Structured Data Testing Tool to check whether Google can correctly parse your structured data markup and display it in search results.
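
As an example – a minimal, hypothetical JSON-LD snippet using the schema.org Article type, with made-up values – you could paste markup like this into the tool and check that Google parses each property correctly:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Top 10 Free Tools for Optimizing a Website",
      "author": { "@type": "Person", "name": "Example Author" },
      "datePublished": "2016-04-20"
    }
    </script>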

 

 

7. Mobile Friendly Tool

It’s a Google tool that analyzes a web page, reports whether the page has a mobile-friendly design, and gives optimization recommendations.


Using the Mobile Friendly Tool you can see how Googlebot sees your page and your website as a whole. You can identify any errors or issues that need to be fixed.

 

8. Page Speed Tool

The PageSpeed tools analyze and optimize your site following web best practices.


Using Google’s PageSpeed tool you can analyze your site’s performance and identify how you can make your site faster and more mobile-friendly.

 

9. Search Console

Google Search Console is a free service offered by Google that helps you monitor and maintain your site’s presence in Google Search results.


Using Google Search Console you can monitor your website’s performance in Google search results – you can check whether Google can access your content, submit new content for crawling, monitor malware or spam, and more.

 

10. Google Analytics

Google Analytics lets you measure your advertising ROI as well as track your Flash, video, and social networking sites and applications.


You can use Google Analytics to track your website and report on its traffic. You can generate reports and find valuable information and data about your users, content, referrals, social channels, traffic sources and more.

 

6 Best Free Photoshop extensions for Web Designers

These Photoshop extensions are cute and useful for web designers.


Skeuomorphism.it

Skeuomorphism refers to a design principle in which design cues are taken from the physical world.


Skeuomorphism.it is a free Photoshop extension by Roy Barber which turns skeuomorphic designs into flat designs. If you find anything skeuomorphic (an icon, background or website template) that you want to turn flat, this is the plugin to get.

This simple Photoshop plugin transforms your designs in seconds. Using it you can change a skeuomorphic design into a flat one – useful if you have skeuomorphic website templates, icons and similar assets that you want to flatten.

There are already lots of online collections of beautiful free skeuomorphic Photoshop goodies that you can use directly.

 

 

GuideGuide

GuideGuide is a free plugin for dealing with grids in Photoshop.


Using GuideGuide you are able to create accurate columns, rows, midpoints and baselines. The extension makes working with grids in Photoshop extremely easy.

If you need to design a site with multiple columns and gutters the GuideGuide extension is right for you.

 

Flaticon

The largest database of free vector icons, available in PNG, SVG, EPS, PSD and Base64 formats.


You can use the Flaticon plugin to find the icons you need for your design without leaving your work environment. The icons are scalable, editable and accessible to any screen reader.

 

Breezy

Breezy is a free Photoshop extension that exports multiple layers.


Using the Breezy extension you can export multiple graphic elements from your PSD. This added Photoshop functionality makes it extremely fast to prepare graphic files for app development, websites or Flash banners.

 

   

Cut and Slice Me

Cut&Slice Me is a Photoshop extension that simplifies the process of cutting and slicing designs.


The Cut&Slice Me extension enables you to export assets for different devices (Android, iPhone, desktop). The exported files can have various scales and resolutions (Retina for iPhone, or HDPI, LDPI, MDPI and XHDPI for Android).

This makes it possible to prepare assets for several target devices in very little time.

 

 

CSS3Ps

CSS3Ps is a free cloud-based Photoshop plugin that converts your layers to CSS3.


CSS3Ps calculations are made in the cloud, so there is no need to update the plugin to use new features. The plugin also supports multiple layers, vendor prefixes, and SCSS and Sass for Compass.

 

 

 

Google Resizer

Google’s new Resizer tool helps designers and developers test any URL

Google’s new Resizer tool is an interactive viewer.


What does the Resizer do?

  • It helps designers and developers to see and test any URL.

  • It shows how your site looks on different screen sizes

Using the Resizer tool you can see how different digital products respond to Material Design across desktop, mobile, and tablet.

As designers and developers working on UI know, one of their biggest challenges is serving the right UI of an application to users on different devices. No matter how users access the application, designers and developers are the people responsible for making digital products accessible to everyone. Until now there has been no simple design solution that fits every need and standard.

As many of you know, Google offers Material Design guidance with useful tips about UI patterns, surface behavior, responsive grids, colors and more. Now we also have Resizer, which gives designers and developers a key tool for testing Material Design and helps align the work of designers, developers and the owners of the responsive UI they need.

Resizer allows you to input any custom URL, or preview any website or local demo, and by pressing the action button you can see how your user interface will look on different devices (desktop or mobile). To do this, just visit Resizer and paste your website URL into the box at the upper-middle of the page.

Useful UI Patterns and demos

The Google Material Design guidelines provide lots of potential pattern variations that can be used as a base for handling screen-size changes and the positioning of specific components of your UI. You can use any of these patterns in Resizer and visualize them to see which one is best for your work. Some of these patterns can be found in the demos included in Resizer, built with Angular (the Pesto and Shrine demos) or with Polymer. You can choose from these available best practices by clicking on the address input box. While testing you can change the screen resolution and use the many other resolution options found on the ribbon.
But if you use a completely separate template for mobile devices, Resizer will not be your best friend. Maybe Google wanted to focus on a single, responsive template?

Google also provides other free and interesting tools and resources at design.google.com/resources. You can find a library of about 900 system icons free to use, and Device Metrics – a rich, comprehensive resource covering multiple devices.

Let’s start using Resizer and exchange our experiences.

Google Page Rank

Google has confirmed it is removing Toolbar PageRank

It’s official: Google has confirmed that it has decided to remove Toolbar PageRank from its toolbar.


That means that if you are using a tool or a browser that shows you PageRank data from Google, it will soon stop showing any data at all.

The visible PageRank has already been removed for some Google Toolbar users; for the rest it will take some time while the removal rolls out. You have probably already noticed that your Google PageRank (in the browser) is showing as not available – or it soon will be.

Google explained that it still uses PageRank data internally within the ranking algorithm; only the external PageRank shown in the Toolbar will be completely removed.

Will the PageRank removal change anything for SEO or site owners?

Google explains that there are no changes to how it uses PageRank after its removal from the Toolbar. The removal will not affect how sites show up in the search results.

We will still be able to get information about a site through Search Console (presence, content, links, etc.). There are lots of other Google tools, toolbars and extensions that continue to help us understand our position in search results.

 

 

 

 

 


How to delete a web page without a negative SEO effect?

When deleting a page (or post) from your website, you remove at least one URL (sometimes more). Subsequently, when this address is visited, the user is usually returned a 404 “page not found” error.

What do you think – is it better if this address is redirected to an existing page? If you have deliberately and definitively decided to remove the content, it is more correct to return a 410 code.
You can find the various options below.

When is the best option to redirect the page, and when is it best to delete that page completely?

What should you use – a 404 error or a 301 redirect?

The first thing you need to consider is whether there is an equivalent of the deleted content anywhere else on the site. If there is a similar page with content that can meet the expectations of visitors, the better action is to redirect the deleted URL to the page with the relevant content.
Even if only a small proportion of visitors benefit from this redirect, it is better than showing absolutely everyone a message about a nonexistent page.

Creating a redirect

When you create a redirect from the removed URL to another one, make sure you use a 301 redirect. There are several redirect options, and a 301 redirect is the one that indicates the redirection is permanent and NOT just temporary. With a 301 redirect, Google and other search engines will transfer the accumulated PageRank from the old URL to the new URL.
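
As an illustration only – assuming an Apache server with mod_alias enabled, and using hypothetical URLs – a permanent redirect can be declared in the site’s .htaccess file like this; other servers and WordPress redirect plugins offer equivalent settings:

    # .htaccess – permanently (301) redirect a deleted page to its closest equivalent
    # example paths, replace with your own URLs
    Redirect 301 /old-deleted-page/ https://www.example.com/similar-existing-page/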

When is it appropriate to delete a page forever?

It is important to know whether there is another page with the same or similar information on the website. If there is no such page, we should seriously consider whether it is good for the website to delete the page, or whether it is better just to enrich its content.

If you have finally decided to delete a page, use the proper code: 410 – content deleted.

What is the difference between a 404 error and a 410 error?


Error 404 means “content not found”, but error 410 means “content deleted”.

These specific codes help Google understand whether the given URL should be removed permanently or whether the page should only be removed from the index.
If you decide to use code 410, one problem can occur: in the Google Search Console (GWT) reports it will stay under the “Not found” category, just like any ordinary 404 page.
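
Again assuming an Apache server and a hypothetical path, a 410 response can be set up in .htaccess with the same mod_alias directive used for redirects:

    # .htaccess – tell crawlers this page was deliberately removed (HTTP 410 Gone)
    Redirect gone /permanently-removed-page/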

What is the “bad thing” about deleting a page?

Often the decision to delete one or more posts or pages from the site leads to subsequent, indirect negatives. Let’s say you deleted all the texts/articles associated with a particular tag. Once this tag is empty, the URL of its archive will also return a 404 error. Even after the proper treatment of the deleted posts (redirecting them or deleting them with code 410), the tag archive will continue to return a 404 error, so you will have to take specific action with respect to that URL.

Even if you do not delete everything associated with that tag, it can be a problem. For example: a page displays up to 10 posts. Before, you had 12 posts shown on 2 pages, but now, after removing some of them, only 5 remain. The second page will then return a 404.

Finally, we can say that deleting 2–3 posts is not the biggest problem, but we know that Google’s Panda exists and we should not allow 404 errors to pile up on our website.

 

 

Google Panda Update

Do we really understand Google’s Panda update?

Google Panda is one of the filters Google uses to lower the ranking of pages with low-quality content, which in turn ranks up pages with valuable, quality content. Often, however, this algorithm remains misunderstood.

“The Panda algorithm, which applies to all sites, is a key signal taken into account when determining page rank. It measures the quality of a website, and more information can be found in these instructions. Panda allows Google to take quality into account and include it in the overall ranking assessment.”

Google

Many SEO specialists try to reverse engineer the Panda algorithm, exploring the “winner” and “loser” sites of each update, but correlation simply does not equal causation.

There are many myths about the Panda update which people take for facts, and their desire to “fix” the penalties imposed by the algorithm leads to even more problems for their sites. One example of a common wrong practice is removing low-quality content entirely just to improve the ranking.

Add to the mix the fact that the update now rolls out slowly, and it becomes harder to identify whether Panda is influencing a site at all – or whether the effect comes from one of the other tools Google changes every year.

So here is what is known about Google Panda, confirmed by Google itself, so that webmasters affected by the algorithm can recover in time.

The impact of Panda on content

Rethinking general content and Google Panda

Google recently advised not to remove content in order to avoid Panda. Some SEO experts disagree with this statement (even quite loudly), but it makes a lot of sense. Removing content simply because you think it is bad for the site in terms of Panda can create serious problems with visibility in the search engine.

 

Removing the wrong content

The biggest problem occurs when webmasters decide to remove content that Google may consider valuable (i.e. it brings traffic and ranks well). If you do not review every part of the content and compare it with Google’s data, it is possible to delete content that performs well but simply covers a more specific, less visible topic.

You can use Search Analytics in Google Search Console to see which pages have high traffic and determine where there is a need to change the content.

Google advises reviewing the search queries and, where they do not match the content on your pages, making the relevant changes.

As a whole, removing the wrong content can have a serious impact on your site, because you both remove content that Google considered good and waste the traffic those pages brought. So it is very easy to think that your site is losing traffic due to the Panda algorithm or something else, when actually you are the reason, by removing the pages.

There is also the possibility that another search engine considers your content great, and you lose that traffic too by eliminating it.

 

Remove or correct content

As already mentioned, removing content is not the best solution, so you should not use the “remove everything” strategy that many experts most often suggest. Some time ago the approach was to add new, quality content and to improve the old content rather than erase it. If someone was determined to remove content, it was simply prevented from being indexed by using robots.txt or NOINDEX.

In fact, this is the only action Google proposes for low-quality content – it simply should not be indexed, rather than being eliminated completely.

Non-indexed content can still be found, but each user has to look for it on the site personally, and you can therefore track whether these pages create a natural flow of traffic on the site. Then you can see whether the content was incorrectly marked as weak and should be indexed after all.

There is also the option to temporarily stop the indexing of specific pages with poor content until it can be improved. This was recommended by Google back in 2011, before the Panda algorithm was officially released.
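
A minimal sketch of that approach, assuming you can edit the page template: a robots meta tag in the page’s <head> keeps the page out of the index while visitors can still reach it and crawlers can still follow its links.

    <!-- keep this page out of the search index, but let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">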

As for low-quality pages, it is better simply to avoid creating them in the first place – that is, pages that do not fall into the category of unique content that is useful to users and makes them trust you in the future.

 

When removing content is necessary

Despite everything written above, there are still cases in which removing content is the only option – for example, if a forum has a glut of spam posts that cannot be changed for the better.

The same goes for a website that only republishes RSS feeds, if the webmaster does not have time to rework all of those published articles in a new and interesting way.

 

Adding new, quality content

Adding new content to the site is also a great solution for Panda. The site always gains from new material, even if Panda negatively affects some parts of it.

 

Sites affected by Panda still get ranked

A common misconception about Panda is that if a website is hit by Panda, it no longer ranks. This, of course, is not true.

Most people see sites with terribly poor-quality content and assume that the whole site is weak. If a website has quality content on some pages, the website continues to be shown in the results.

Google also confirmed:

“The Panda update may continue to show such websites for specific and precise queries, but their visibility in search will be limited, which benefits neither the site owner nor users.”

There is another reason Google advises against eliminating poor-quality content. Eliminating spam is a plus, but if the quality of some pages is low, just add new content and new pages.

 

Do all pages need high-quality content for the website to be good in Panda’s eyes?

Sites that have pages affected by Panda can still be found in the search results. It is important for the majority of the content to be of high quality. There is no formula or special recipe for it.

Quality content improves ranking.

“In general, high-quality sites provide great content to users on most of their pages.”

Google

A site with several weak pages can still be seen as a great website, but it is always better to improve the content. This is not something webmasters should be overly concerned about; it is important, however, that such pages remain few.

Consumer expectations

Do you meet the expectations of users? It is important that when your website ranks for certain keywords, the content actually matches those words. You may have great text, but if your pages appear for queries where they do not meet people’s expectations, that is a signal of poor-quality content to Google.

“At the end of the day, what matters is not how many visitors you had at any given time, but how helpful you were to them.”

Google

Duplicate content

Many people believe that the duplicate content filter (yes – a filter, not a penalty) is the heart of Panda, but it is not. The algorithm and duplication are two different things.

However, it is good to remove duplicates, especially if there are many of them, because they can interfere with your site.

In terms of SEO, duplicates are a lower priority than providing the highest possible quality content. Google has said that if you need to prioritize your time, cleaning the site of duplicate content can come last. Of course, it is important to optimize, but this factor is not critical – the algorithms deal with this problem without difficulty. For example: if you buy a car, it is natural to want it to look good, but if there are serious technical failures, those will be your priority.

In other words, internal duplicate content is not of great importance, especially in the world of WordPress, where less experienced people often duplicate pages. It is only a small part of the algorithmic puzzle, although it does play a role.

 

Similar content

Especially for “How to…” sites it is important to monitor the content, because quite often there are similar pages. Quite often, when there are overlapping and similar elements on such sites, their quality drops significantly.

For example, the site eHow has over 100 articles (maybe even more) on “How to unclog a toilet”. Unless the site is called uncloggingtoilets.com, it is desirable for some of these pages to be deleted or combined – possibly Google does not even send any traffic to those 100 pages.

Check the links to these pages and see which ones bring in traffic, because that is a sign they have enough quality to rank. But pay attention to those that have none.

 

Error 404 – page not found

One of the most common errors – 404 – occurs when there is a problem with crawling or when a page has been deleted. This error does not affect Panda in any way.

 

Aggregated content

Google is not OK with websites built on aggregated content, with a few exceptions.

“One thing you really need to consider is that our algorithms focus on unique, compelling and high-quality content. If a site just piles up links to other sites, it has no value appearing in the search engine – it would be better for users simply to visit the specific sites directly. So rather than focusing only on technical issues such as crawling and indexing, we recommend that you step back and reconsider the model of the site.”

Google

There are exceptions to this rule – Techmeme.com is an example of an aggregator that operates successfully on the web according to Google.

 

Look at the site with new eyes

For a webmaster it is important to determine whether the content of a website is really of high quality. Sometimes it takes someone who is not directly linked to the site to visit it and share an opinion.

Many SEO experts think their website consists of amazing, unique content, whether it was written by them or by someone else, but sometimes their judgment is subjective.

 

Number of words and on-page factors

Rethink the number of words

On the one hand it is a good idea to provide content over a certain number of words, but on the other hand, if you have short content it will not necessarily be hit by Panda. In fact, there is a lot of short content that Google not only considers to be quality, but rewards by setting it up as its own snippet.

If only the number of words mattered, the game would become very easy for spammers. So the advice is not “write content of n words because of Panda”, but simply to optimize those parts of the site as well as possible.

 

Information snippet on Google

The theory states that content of less than 250 words is “weak”, but if a site removed such content just because it is short, the website would not rank any better, and it would lose its snippets.

 

Advertising & Partners (Affiliates)

Advertising and affiliate links affect Panda, but only because they divert attention from the site’s main content.

Webmasters should not be interested in the number of visitors to a site, but rather in how many of them were actually helped. This is often at odds with reality: too many sites try to get more benefit from the user than they give the user who comes to the site.

Affiliate sites do not automatically raise an alert, but Panda checks them harder than ordinary non-affiliate sites. Many affiliate sites simply do not create content that Panda approves of. The same can be said for pages that simply annoy visitors with massive amounts of advertising.

Advertisement

As Google’s guidelines say about advertising: as long as your advertisements do not intrude on people, there is no Panda problem. There is also a separate algorithm for page layout, which looks at the positioning of ads on the pages and how much of the page is pushed out of the visible part of the display area.

A key point is the way in which users perceive the advertisements.

People often assume that too many ads in the visible part of the content is grounds for punishment by the Panda update, but this issue is dealt with by another algorithm called Top Heavy. Top Heavy targets sites displaying prominent ads in the visible area that make the user scroll down to see the content.

 

Content generated by users

This type of content is often a matter of discussion. Many SEO experts recommend removing it because they believe that, according to Google, it deteriorates quality. But this is far from the truth.

There are a lot of sites where the primary content is generated by users, and they rank well and are well positioned in Google’s results.

The search engine sees nothing wrong with quality sites built on user-generated content, and you should not believe the motto “You have to delete everything written by visitors.”

 

Forums

When talking about forums it is important to pay special attention, because removing content from forums as a preventive measure against Panda is not the best solution. If this happens, we move toward imposing censorship, and quite a different problem appears. You can have a forum with great content where the user content is not at a high level, but there is nevertheless a chance it will be useful. If you start removing content generated by visitors, they may abandon you.

 

The titles in the forums

It is desirable to give moderators/admins the freedom to change some titles in the forums. Often topics start with titles like “HELP” or “What should I do?”, which are unclear. Rewriting the title is not just for the search engines – it also helps all users. Google wants to send you traffic, but you need to show relevant results for people.

Another excellent recommendation for improving titles (of the “Help” type) is to add terms such as “solution” or similar. Google sometimes rewrites headlines and does not guarantee it will show yours, but your chances increase, because by doing this you send a clear signal that you want to be helpful to people.

You can add it at the beginning or the end of the title – some people do not want to risk changing the location of keywords in the title tag – but do it so that it can be seen in the search results, especially if the title is longer. This is something that works and ranks well for many sites. If someone searches for a specific problem and sees in the title tag that there is a solution, that will automatically win you a visitor.

Comments

Comments can be another problematic area in terms of the Panda update. Good comments indicate that a page can be very interesting to users, but bad comments can ruin the reputation of the entire page.

Even news items included in Google News can be dropped if the algorithm notices they have received too many comments. In this case, the article starts to be copied too much and the value of the original article drops.

Comments can definitely be affected by Panda, depending on their type and how they are moderated.

The web service Disqus, for example, has the option for comments not to be part of the page’s HTML code. WordPress has an option to paginate comments, which avoids the situation of hundreds of poor-quality comments harming the reputation of the entire page.

Comments as a positive signal

Good comments can also be of great help in terms of the Panda update. If you have quality comments, they enhance the prestige of the page, show interaction with users and reflect the actual popularity of the content.

 

Spam comments and comments that escalate into personal conflict

This type of comment is bad for your webpage and for the website as a whole. If you let comments multiply without moderating them, and they fill up with spammers and people who just insult each other rather than comment on the subject, it will definitely weaken the authority of all your content.

What do you think Panda will do when it sees quality content at the top of the page followed by lots of spam comments? Will the algorithm’s final verdict be positive or negative?

If the comments below an article are poor, it may be best to block them. It is very important to follow a strict policy regarding comments. If people start to associate your company with ugly, empty comments, it does not help your business.

 

Comments can affect entire websites

Sometimes spam comments affect entire sites. This usually happens with sites that do not offer particularly high-quality content. In such a situation, bad comments can affect even pages that have no comments at all.

 

Remove all comments

This practice has been observed lately from some experts, who recommend removing all comments as a measure against the Panda update.

A decision like this means missing out on some benefits. First, if the content is good, the good comments – and the engagement that raises the website’s prestige – will no longer be visible. Second, the value of the article will not reach its maximum level, because many users revisit an article again and again just to follow the opinions of others.

The absence of any comments is also not good for the site: a user who searches for information on a given subject will not be able to see the reactions of other people interested in the same topic. Comments often serve to clarify things from the article.

The presence of comments can also help your website’s search engine rankings for its keywords.

Loading speed

Many people still wonder whether the loading speed of a website is a ranking signal. Page loading speed is officially recognized by Google as a ranking factor. It is not as strong a signal as links and content, but it is still important.

User engagement

There is no evidence of a direct connection between user engagement and Panda, although many experts believe that likes and shares are indicators of quality.

The only possible engagement signal Panda could pick up is based on user comments, which is another reason to think carefully about whether to keep or remove them.

Advantages of TLDs (top-level domains)

Although some TLDs have more spam than others, Panda does not treat content differently depending on which TLD it is hosted on. Google does not give preference to sites just because they are .com or .gov.

 

And again:

Remember that Panda deals with content rather than technical problems. Much of the trouble you have may be caused by technical issues, but no matter how hard you try to correct those, it will not help you with Panda.

Simply create unique, quality content and the Panda penalty will stay away from you.