301 redirect

301 redirect – Google changes the rules of the game again

The new rules Google has implemented for 30X-type redirects change how redirects affect page ranking (PageRank). Here are the main differences in these redirects before and after the new changes were introduced:

Old SEO rules for “30X” redirects:

  • A 301 redirect loses roughly 15% of the PageRank of the indexed pages that are subsequently redirected.
  • A 302 redirect passes no PageRank, since it marks a temporary move of pages (to a new domain, for example).
  • Redirecting pages protected by the HTTPS protocol does pass PageRank, since such moves are largely implemented as 301 redirects.

 

New rules for “3xx” redirects:

All of the changes Google made are primarily tied to the use of the HTTPS protocol instead of HTTP, in order to better protect the information on the pages being redirected.

 

Here is how Google's new approach affects PageRank transfer when “3xx” redirects are used:

 

This raises the questions:

Doesn't using a 301 redirect always risk a loss of traffic?

The answer is NO. Ideally, the server that returns a 301 redirect code forwards the user to an exact copy of the page they were looking for. The only thing that changes is the page's URL, which does not necessarily cost any traffic. But not every redirect is so painless for PageRank. A logically unrelated redirect will certainly lead to a penalty: for example, redirecting a page devoted to a poet's work to a landing page selling a product with a similar name.
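The mechanics can be sketched with a short, self-contained Python example (the /old-page and /new-page paths are invented for illustration): a server answers the old URL with a 301 plus a Location header, and the client transparently lands on an exact copy of the content at the new URL.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

OLD_PATH = "/old-page"   # hypothetical old URL
NEW_PATH = "/new-page"   # hypothetical new URL

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == OLD_PATH:
            # 301 tells crawlers the move is permanent, so accumulated
            # PageRank should be transferred to the new URL
            self.send_response(301)
            self.send_header("Location", NEW_PATH)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"new page content")

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
# urllib follows the 301 automatically, just as a browser would
with urllib.request.urlopen(f"http://127.0.0.1:{port}{OLD_PATH}") as resp:
    body = resp.read().decode()
    final_url = resp.geturl()
server.shutdown()

print(final_url, body)
```

From the visitor's point of view nothing is lost: only the URL they end up on has changed.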

Glenn Gabe recently confirmed that Google may treat a page with outdated content as a “soft” 404 error. This means that redirecting such a page carries no PageRank weight, so it will rank as “insignificant” in the search results.

Is a 302 redirect better than a 301?

Again, the answer is NO! It recently became known that Google treats a 302 redirect (moved temporarily) and a 301 redirect (moved permanently) as equivalent. This led to massive use of 302 redirects instead of 301s, which in turn produced chaotic ordering of pages in the results.

Although Google treats 302 and 301 redirects as equivalent, there are several PageRank-related factors that every SEO must weigh when choosing a redirect:

1. It is naive to assume that 301 and 302 redirects are interchangeable in every situation. Previously, a 302 redirect would start passing PageRank only after a significant period following its implementation, while with a 301 redirect the PageRank transfer happens much sooner after it is put in place. To date there is no precise data on how long it takes a 302 redirect to start passing PageRank.

2. Also, the HTTP 302 redirect (“moved temporarily”) is part of the web standard and is used by other major players in the online space such as Baidu, Bing, DuckDuckGo, and social platforms like Facebook and Twitter, which should not be forgotten.

What Rand Fishkin says on the topic:

Is it possible to preserve all accumulated traffic when a site is moved to the secure HTTPS protocol?

It is possible. Google wants every site to adopt the HTTPS protocol. The main problem is that using a 301 redirect loses 15% of the traffic accumulated before the move.

 

Chromium project of Google

Every website becomes WebVR with Google's Chromium project

Google's new big goal is to make every website viewable as WebVR through the Chromium project.

WebVR support is currently available only behind an experimental flag (chrome://flags/#enable-vr-shell), but it is expected to change how we browse the Internet in the near future. To enable access to the WebVR APIs in these builds, turn on the “Enable WebVR” flag in about:flags or launch Chrome from the command line with --enable-webvr.

François Beaufort from Google said the Chrome Beta and Chrome Dev channels will gain additional new settings that let users switch their browsing into virtual reality, using Cardboard or Daydream-ready viewers.

And this is just the beginning. Google's entirely new Daydream platform (for high-quality mobile virtual reality) is coming in Fall 2016.

WebVR will be the new way of surfing the internet and visualizing information.


keywords search tool

The 3 Best Keyword Research Tools for SEO

As you know, keywords on their own are no longer the most powerful SEO tool. Long-tail keyword phrases receive less traffic, but the traffic is more qualified: users who are typically further down their path of intent.

So choosing the right keyword phrases for your website's pages is important, and quite easy when you use the right tools to perform your keyword research.

Two general aspects need to be taken into account when you start choosing the right long-tail keywords for your business. They are:

Keyword Relevance

Relevance is the most important factor. The more specific you are, the better.

For instance, if you own a company that installs waste management systems, you should target a keyword such as “fiberglass in-ground waste management installation” rather than “waste management installation”. This increases your chance of attracting customers searching for “fiberglass in-ground waste management installation”, which is far more specific to your business!

Sure, optimizing for “waste management system” has its place. But be aware that this keyword variation will attract a much more generic audience that may not be looking for what you have to offer. The solution is to use more relevant, long-tail keywords.

Location-Based Keywords

When looking for products or services in a specific area on Google, users usually include their location in the search. So your keyword phrase becomes “fiberglass in-ground waste management installation in Plovdiv”.

If you work in one specific geo-location, it's good to add location-based keywords to all of your pages.
If you operate in several geo-locations, it is better to create a separate web page dedicated to each location. This will attract more visitors to your site from searches for the individual locations.

So how do you choose the right keyword phrases for your business? Here are some useful free and paid keyword research tools.

Free Keyword Research Tools
1) Google Keyword Planner

Keyword Planner is a free AdWords tool that helps you build Search Network campaigns by finding keyword ideas and estimating how they may perform.

It lets you search for keyword and ad-group ideas, get historical statistics, and see how a list of keywords might perform. Keyword Planner can also create a new keyword list by multiplying several lists of keywords together. Since it's a free AdWords tool, it can also help you choose competitive bids and budgets for your AdWords campaigns.

2) Google Trends

Google Trends uses real-time search data to follow consumer search behavior over time, and shows how a new search trend could affect popularity in the search results.

3) Keyword Tool.io

Keyword Tool helps you understand what your audience searches for in the Google search engine.

Google ranking

Study says: outbound links affect the ranking of sites in Google

Recently, Reboot Online published the results of their research into how outbound links affect the position of websites in Google search results, despite statements to the contrary by the search engine's representatives.

What are outbound links?

Outbound links are links that are meant to take you elsewhere: to another specific webpage, or to a different website altogether. There are two kinds of outbound links, dofollow links and nofollow links.

What did they do?

The study covers 10 newly purchased domains. All 10 domains were registered at the same time to eliminate the influence of the domain-age factor.

Several articles were published on these new sites. Two highly respected, relevant resources were mentioned in each of the articles. On 5 of the sites the resources were mentioned as plain text only, and on the other 5 as hyperlinks, as follows:

1) aveoningon.co.uk – does not contain outbound links
2) bistuluded.co.uk – contains outbound links
3) chotoilame.co.uk – does not contain outbound links
4) dyeatimide.co.uk – contains outbound links
5) edikatstic.co.uk – does not contain outbound links
6) foppostler.co.uk – contains outbound links
7) gamorcesed.co.uk – does not contain outbound links
8) heabasumel.co.uk – contains outbound links
9) iramebleta.co.uk – does not contain outbound links
10) jundbaramn.co.uk – contains outbound links

All of these sites were designed with a similar, but not identical structure.

Web crawlers' access to the domains was blocked until the sites' content had been published. This step was important to make sure Google would index the content of all the websites at the same time.

 

Once crawler access was unblocked and the sites were indexed, the Reboot Online staff performed several searches for key phrases and recorded the results as screenshots. Ranking progress was monitored for 5 months.

 

Ranking results

It was no big surprise that the presence of outbound links to reputable sites (in the analysed websites) had a positive impact on site ranking.

Heat map

Heat maps were created from the ranking results taken on each day of testing. Green squares indicate that the site held the expected position in the Google SERP for the keyword. Red squares (or a red gradient) denote a site holding a low position: the further it is from the expected position, the darker red the square.

The statistics for the keywords [Phylandocic] and [Ancludixis] were published:

Keywords [Phylandocic] results

Phylandocic

 

Keyword [Ancludixis] results

Ancludixis

Websites ranking position

Website ranking positions for the same keywords were also presented. These diagrams show the positions of the sites in the Google rankings: the blue line shows the positions of sites containing outbound links, and the orange line the positions of sites that do not contain outbound links.

Keywords [Phylandocic] results

Phylandocic

 

Keyword [Ancludixis] results

Ancludixis

 

 

Google Page Rank

Google has confirmed it is removing Toolbar PageRank

It's official: Google has confirmed its decision to remove Toolbar PageRank from its browser.


This means that if you are using a tool or a browser that shows you PageRank data from Google, it will soon stop showing any data at all.

The visible PageRank has already been removed for some Google Toolbar users; for the rest it will take some time as the removal rolls out. You have probably already noticed that your Google PageRank (in the browser) shows as unavailable, or it will be unavailable soon.

Google explained that it still uses PageRank data internally within the ranking algorithm; only the external PageRank shown in the Toolbar will be completely removed.

Will the PageRank removal change anything for SEO or site owners?

Google explains that nothing changes in how Google uses PageRank after its removal from the Toolbar. It will not affect how sites show up in the search results.

We can still get information about a site through Search Console (presence, content, links pointing to it, etc.). There are plenty of other Google tools, toolbars and extensions that continue to help us understand our ranking position in the search results.

 

 

 

 

 

seo effect

How to delete a web page without a negative SEO effect?

When you delete a page (or post) from your website, you remove at least one URL address (sometimes more). Afterwards, when that address is visited, the user is usually served a 404 “page not found” error.

What do you think? Is it better if such a page is redirected to an existing page? If you have deliberately and definitively decided to remove the content, it is more correct to return code 410.
You can find the various options below.

When is the best option to redirect the page, and when is it best to delete the page completely?

What to use – 404 error or a 301 redirect?

301 redirect and 404 error
The first thing to consider is whether there is an equivalent of the deleted content anywhere else on the site. If any similar page can meet visitors' expectations with its content, the better action is to redirect the deleted URL to that page with relevant content.
Even if only a small share of visitors benefit from the redirect, that is better than showing absolutely everyone a message about a nonexistent page.

Create redirection (redirect)

When you create a redirect from the removed URL to another one, make sure you use a 301 redirect. There are several redirect options, and 301 is the one that indicates the redirection is permanent, NOT just temporary. With a 301 redirect, Google and other search engines will transfer the accumulated PageRank from the old URL to the new URL.
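Since a server can silently be misconfigured to answer with a 302 where a 301 was intended, it is worth checking the raw status code your redirect actually returns. A minimal Python sketch (the /moved and /final paths are invented for the demo): a redirect handler that refuses to follow redirects makes urllib surface the real 3xx code and Location header.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError

class NoFollow(urllib.request.HTTPRedirectHandler):
    # returning None leaves the 3xx unhandled, so urllib raises an
    # HTTPError carrying the raw redirect status instead of following it
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoFollow)

def redirect_status(url):
    """Return (status, Location) of a redirect without following it."""
    try:
        opener.open(url)
        return None  # no redirect at all
    except HTTPError as err:
        return err.code, err.headers.get("Location")

# A throwaway local server that answers with a 302 by mistake:
class Temp302(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)
        self.send_header("Location", "/final")
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Temp302)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = redirect_status(
    f"http://127.0.0.1:{server.server_address[1]}/moved")
server.shutdown()

print(status, location)  # a 302 here tells you to switch the rule to 301
```

The same `redirect_status` helper can be pointed at any URL on your own site after setting up a redirect, to confirm the permanent 301 is what crawlers actually see.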

When it is appropriate to delete a page forever?

It is important to know whether another page with the same or similar information exists on the website. If there is no such page, we should think seriously: is it good for the website to delete the page, or is it better just to enrich its content?

If you have finally decided to delete a page, use the proper code: 410 – deleted content.

What is the difference between 404 error and 410 error?

410 code error

Error 404 means “content not found”, while error 410 means “content deleted”.

These specific codes help Google understand whether the given URL should be removed for good or the page should only be dropped from the index.
If you decide to use code 410, one problem can occur: in the Google Search Console (GWT) report it will appear under the “Not found” category, just like any ordinary 404 page.

What is the “bad thing” about deleting a page?

Often the decision to delete one or more posts or pages from the site leads to subsequent indirect negatives. Let's say you deleted all the articles associated with a particular tag. Once that tag is empty, the URL of its archive will also return a 404 error. Even after properly handling the deleted posts (redirecting them or deleting them with code 410), the tag's archive will continue to return a 404, so you will have to take specific action for that URL as well.

Even if you did not delete everything associated with that tag, there can be a problem. For example: a page displays up to 10 posts. Before, you had 12 posts shown on 2 pages; now, after removing some of them, only 5 remain. The second page will return a 404.

Finally, we can say that deleting 2-3 posts is not the biggest problem, but we know that Google's Panda exists and we should not allow 404 errors to pile up on our website.

 

 

Panda Google Update

Do we really understand Google's Panda update?

Google Panda is one of the filters Google uses to demote pages with low-quality content and, in turn, rank up pages with valuable, high-quality content. Often, however, this algorithm is misunderstood.

“The Panda algorithm, which applies to all sites, is a key signal taken into account when determining page rank. It measures the quality of a website, and more information can be found in these instructions. Panda allows Google to take quality into account and include it in the overall ranking assessment.”

Google

Many SEO practitioners apply reverse engineering to the Panda algorithm, studying the “winner” and “loser” sites of each update, but correlation is simply not causation.

There are many myths about the Panda update that people take for facts, and their eagerness to “fix” the penalties imposed by the algorithm leads to even more problems for their sites. One example of a common bad practice is removing low-quality content wholesale just to improve ranking.

Add to the mix the update's current slow rollout, and it becomes harder to identify whether Panda is influencing a site at all, or whether this is just another Google mechanism that changes every year.

So here is what is known about Google Panda, confirmed by Google itself, so that webmasters affected by the algorithm can recover in time.

The impact of Panda on the content

Rethinking the overall content and Google Panda

Google has recently advised against removing content to avoid Panda. Some SEO experts still disagree with this statement (quite loudly, even), but it makes a lot of sense. Removing content simply because you think it is bad for the site in Panda terms can create serious problems with visibility in the search engine.

 

Remove the wrong content

The biggest problem occurs when webmasters decide to remove content that Google may consider valuable (i.e. it brings traffic and ranks well). If you do not review every piece of content and compare it against Google's data, you may delete content that performs well but simply covers a more specific, less visible topic.

You can use Search Analytics in Google Search Console to see which pages get high traffic and determine where the content needs changing.

Google advises reviewing the search queries and, where they do not match the content on your pages, making the relevant changes.

On the whole, removing the wrong content can have a serious impact on your site, because you both remove content Google considered good and throw away the traffic those pages brought. It is very easy to blame the Panda algorithm or something else for lost traffic when the real reason is that you removed the pages yourself.

There is also the possibility that another search engine considers your content great, and you lose that traffic too by eliminating it.

 

Remove or correct content

As already mentioned, removing content is not the best solution, so you should not follow the “remove everything” strategy that many experts most often suggest. The long-standing advice has been to add new quality content and improve the old rather than erase it. Anyone determined to remove content from search could simply prevent it from being indexed using robots.txt or NOINDEX.

In fact, this is the only action Google proposes for low-quality content: it simply should not be indexed, not eliminated completely.

Non-indexed content can still be found, but each user has to look for it on the site personally, and you can therefore also track whether these pages create a natural flow of traffic on the site. Then you can see if the content was incorrectly marked as weak and should be indexed after all.
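The NOINDEX route mentioned above is typically a one-line meta tag in the page's head. As a generic sketch (not code from any particular site):

```html
<!-- keeps the page reachable for visitors but out of the search index -->
<meta name="robots" content="noindex">
```

Note that a robots.txt Disallow rule is a different mechanism: it blocks crawling of the URL rather than explicitly asking for it to be dropped from the index.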

There is also the option to temporarily stop indexing specific pages with poor content while it is being improved. Google recommended this back in 2011, before the Panda algorithm was officially released.

As for low-quality pages, it is better simply to avoid creating them in the first place, i.e. pages that do not offer unique content useful to users, the kind of content that earns their trust in the future.

 

When removal of content is necessary

After all of the above, there are still cases in which removing content is the only option. For example, a forum flooded with spam posts that cannot be changed for the better.

The same goes for a website that only reposts RSS feeds, when the webmaster has no time to rework all of those published articles in a new and interesting way.

 

Adding new and quality content

Adding new content to the site is also a great remedy for Panda. The site always wins with fresh material, even if Panda negatively affects some parts of it.

 

Sites affected by Panda still get ranked

A common misconception about Panda is that a website hit by Panda no longer ranks at all. This, of course, is not true.

Most people see sites with terribly poor-quality content and assume the whole site is weak. If a website has quality content on some pages, the website continues to appear in the results.

Google also confirmed:

“The Panda update may continue to show such websites for specific and precise queries, but their visibility in search will be limited, which benefits neither the site owner nor users.”

There is another reason Google advises against eliminating poor-quality content. Eliminating spam is a plus, but if the quality of some pages is low, just add new content and new pages.

 

Do all pages need high-quality content for the website to do well under Panda?

Sites that have pages affected by Panda can still be found in the search results. What matters is that the majority of the content is high quality. There is no formula or special recipe for it.

Quality content improves ranking.

“In general, high-quality sites provide great content for users on most of their pages.”

Google

A site with several irrelevant pages can still be seen as a great website, but it is always better to improve the content. This is not something webmasters should worry about, as long as those pages are few.

Consumer expectations

Do you meet users' expectations? When your website ranks for certain keywords, it is important that the content suits those words. You may have great text, but if your pages appear for queries that do not meet people's expectations on Google, that is a signal of poor-quality content.

“At the end of the day, what matters is not how many visitors you had at any given time, but how helpful you were to them.”

Google

Duplicated content

Many people believe that the duplicate-content filter (yes, a filter, not a penalty) is the heart of Panda, but it is not. The algorithm and duplication are two different things.

However, it is good to remove duplicates, especially if there are many of them, because they can interfere with your site.

In terms of SEO, duplicates are a lower priority than providing the highest possible quality of content. Google has said that if you prioritize your time well, a site will eventually be cleared of duplicate content. Of course, it is worth optimizing, but this factor is not critical; the algorithms handle the problem without trouble. For example: if you buy a car, it is natural to want it to look good, but if it has serious technical failures, those become your priority.

In other words, internal duplicate content is not of great importance, especially in the WordPress world, where less experienced people often duplicate pages. It is only a small part of the algorithmic puzzle, although it does still play a role.

 

Similar content

Especially for “How to …” sites it is important to monitor content, because similar pages crop up quite often. When there are overlaps and similar elements across pages, their quality drops significantly.

For example, the site eHow has over 100 articles (maybe even more) on “How to flush the toilet”. Unless the site is called uncloggingtoilets.com (otpushvanenatoaletni.com), it is desirable to delete or combine some of those pages: possibly Google sends no traffic at all to those 100 pages.

Check the links to these pages and see which ones actually receive traffic, because that is a sign they have enough quality to rank. But pay attention to those that have none.

 

Error 404 – page not found

One of the most common errors, 404, occurs when there is a crawling problem or a page has been deleted. This error does not affect Panda in any way.

 

Aggregated content

Google is not OK with websites built on aggregated content, with a few exceptions.

“One thing you really need to consider is that our algorithms focus on unique, compelling, high-quality content. If a site just piles up links to other sites, it adds no value by appearing in the search engine; it would be better for users simply to visit the specific sites directly. So rather than focusing only on technical malfunctions such as crawling and indexing, we recommend you take a step back and reconsider the model of the site.”

Google

There are exceptions to this rule: Techmeme.com is an example of an aggregator that operates successfully on the web according to Google.

 

Look at the site with new eyes

For a webmaster it is important to determine whether a website's content is really high quality. Sometimes it takes someone not directly linked to the site to visit it and share an opinion.

Many SEO experts think their website consists of amazing, unique content, whether written by them or by someone else, but sometimes their judgment is subjective.

 

Number of words and on-page factors

Rethink the number of words

On the one hand, it is a good idea to provide content over a certain number of words; on the other hand, short content will not necessarily be hit by Panda. In fact, there is a lot of short content that Google not only considers high quality but rewards with its own snippet.

If word count alone mattered, the game would become very easy for spammers. So the advice is not “write content of n words because of Panda”, but simply to optimize those parts of the site as well as possible.

 

Information snippet on Google

One theory states that content under 250 words is “thin”, but if a site removed such content just because it is short, the website would not rank any higher, and it would lose its snippets.

 

Advertising & Partners (Affiliates)

Advertising and affiliate links matter to Panda, but only insofar as they divert attention from the site's main content.

Webmasters should care not about the raw number of visitors to a site, but about how many of them were actually helped. This is often at odds with reality: too many sites try to extract more benefit from the user than they provide when the user lands on the site.

Affiliate sites are not automatically flagged, but Panda checks them harder than ordinary non-affiliate sites. Many affiliate sites simply do not create content that Panda approves of. The same can be said for pages that simply pester visitors with massive advertising.

Advertisement

As Google's guidelines say about advertising: as long as your ads do not intrude on people, there is no Panda problem. There is also a separate algorithm for page layout: how ads are positioned on pages and how far the content is pushed below the visible part of the display area.

A key point is the way users perceive the advertisements.

People often blame the Panda update when too many ads appear in the visible part of the page, but this issue is handled by another algorithm called Top Heavy. Top Heavy pursues sites whose ads are so prominent in the visible area that users have to scroll down to see the content.

 

Content generated by users

This type of content is often a matter of discussion. Many SEO experts recommend removing it, because they believe that, according to Google, it degrades quality. But this is far from the truth.

There are many sites whose primary content is user-generated, yet their rank is great and the site does well in Google's results.

The search engine sees nothing wrong with quality sites built on user-generated content, and you should not believe the motto “delete everything written by visitors”.

 

Forums

Forums deserve special attention, because removing content from them as a preventive measure against Panda is not the best solution. Doing so amounts to imposing censorship, and a different problem entirely appears. You can have a forum with great content where the user contributions are not at a high level but still have a chance of being useful. If you start removing the content generated by your visitors, they may abandon you.

 

The titles in the forums

It is desirable to give moderators/admins the freedom to change some thread titles. Topics often start with titles like “HELP” or “What should I do?” that remain unclear. Rewriting the title is not just for the search engines; it helps all users. Google wants to send you traffic, but you need to show relevant results to people.

Another excellent recommendation for improving titles (of the “Help” type) is to add terms such as “solution” or similar. Google rewrites headlines too and does not guarantee it will show yours, but your chances increase, because this sends a clear signal that you want to be helpful to people.

You can add it at the beginning or the end of the title (some people do not want to risk changing the location of keywords in the title tag), but do it so that it can be seen in the search results, especially if the title is long. This works and ranks well for many sites. If someone searches for a specific problem and sees in the title tag that there is a solution, you automatically win a visitor.

Comments

Comments can be another problematic area in terms of the Panda update. Good comments indicate that a page may be very interesting to users, but bad comments can ruin the reputation of the entire page.

Even stories included in Google News can be dropped if the algorithm notices they received too many comments. In this case the article starts being copied too much, and the value of the original article drops.

Comments can definitely be affected by Panda, depending on their type and how they are moderated.

The web service Disqus, for example, has an option to keep comments out of the page's HTML code. WordPress offers comment pagination, which avoids the situation of hundreds of poor-quality comments harming the reputation of the entire page.

Comments as a positive signal

Good comments can also be of great help with respect to the Panda update. Quality comments enhance the prestige of the page, show interaction with users, and reflect the actual popularity of the content.

 

Spam comments and comments that escalate into personal conflict

This type of comment is bad for your web page and for the website as a whole. If you let comments multiply without moderating them, and they fill up with spammers and people who just insult each other rather than discuss the subject, they will definitely weaken the authority of all your content.

What do you think Panda will do when it sees quality content at the top of the page followed by lots of spam comments? Will the algorithm’s final verdict be positive or negative?

If the comments below an article are poor, it may be best to block them. It is very important to follow a strict comment policy. If people start associating your company with ugly, empty comments, that does not help your business.

 

Comments can affect entire websites

Sometimes spam comments affect entire sites. This usually happens with sites whose content is not of particularly high quality. In such a situation, bad comments can drag down even pages that have never been commented on.

 

Remove all comments

Lately some experts have been recommending the removal of all comments as a measure against the Panda update.

A decision like this sacrifices some benefits. First, if the content is good, the good comments will no longer be visible, nor will the engagement that raises the website’s prestige. Second, the value of the article will not reach its maximum, because many users return to an article again and again just to follow the opinions of others.

The complete absence of comments is not good for the site either: a user searching for information on a given subject will not be able to see the reactions of other people interested in the same topics. Comments often help clarify things from the article.

The presence of comments can also help your website rank in search engines for additional keywords.

Loading speed

Many people still wonder whether loading speed might be a ranking signal. In fact, page loading speed is officially recognized by Google as a ranking factor. It is not as strong a signal as links and content, but it is still important.
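Measuring load time is a reasonable first step before optimizing it. A minimal sketch (the helper name is ours, and a real audit should rely on tools such as Google’s PageSpeed Insights) that times any fetch callable you hand it:

```python
import time

def measure_seconds(fetch):
    """Time a single fetch operation in seconds. Pass e.g. a lambda
    wrapping urllib.request.urlopen(url).read() to time one page
    download; here the callable is deliberately arbitrary."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start
```

Repeating the measurement several times and taking the median gives a steadier picture than a single sample, since network latency varies between requests.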

User engagement

There is no evidence of a direct interaction between user engagement and Panda, although many experts believe that likes and shares are indicators of quality.

The only such signal Panda could plausibly derive comes from user comments, which is one more reason to think carefully about keeping or removing them.

Advantages of TLDs (top-level domains)

Although some TLDs attract more spam than others, Panda does not treat content differently depending on the TLD it is published under. Google does not give preference to sites simply because they are .com or .gov.

 

And again:

Remember that Panda deals with content rather than technical problems. You may assume that much of your trouble stems from technical issues, but no matter how hard you try to correct them, it will not help you.

Simply create unique, quality content, and Panda penalties will stay away from you.

 

Content curation and publishing

3 little-known tools for curating and publishing content

 

New online tools that are appearing, and that are not yet widely known, are changing the way merchants and marketing specialists collect, curate, and deliver content to their audiences through social channels.

Here are three of them.

1. Medium


Medium is a community of readers and writers who offer unique perspectives on creating and sharing content. You need to register before you can start using the platform.

Medium is a great marketing platform because it lets you distribute content quickly. You can write long articles, short posts, or tweet-length notes, and upload content to share with the Medium community. When you publish articles, they are shared by your followers with other members of the network, who search for and find content based on tags and shared opinions.

Medium also offers a curation option that many people overlook. You can create your own publication and curate articles or authors from the network.

Creating a publication
To create a publication, go to the Publications page and click New Publication. Then enter the details of your publication and lay it out. You can choose among several layout styles (Grid, Stream, or List) and decide how many posts appear on the home page.

To curate content, search Medium for your key terms to find articles close in topic and message to your own. Once you find a suitable article you want to publish, click the ellipsis-shaped icon at the end of the article and choose Request Story from the drop-down menu. This contacts the article’s author by email, and they will reply to say whether or not you may publish their story. If you receive permission, the story can be added to your publication.

2. Twitter’s Curator platform


Some of the most popular Twitter accounts do not necessarily share their own content or their own posts. They find the best articles, tweets, videos, photos, and stories on a specific topic and share them with their followers.

Twitter lets you curate content by sharing links or tweets that others have already published. The first approach is especially popular: you search for content online and share it.

The second approach is to use Twitter’s Curator platform, entering specific keywords, hashtags, or individual users. Based on your search criteria, the platform returns the results closest to what you are looking for, in real time. You can then share any of these results with your own followers.
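The keyword-filtering step behind such curation can be pictured with a tiny sketch; this is a toy stand-in, not Twitter’s actual Curator API, and the function name is invented:

```python
def curate(posts, keywords):
    """Keep only the posts that mention at least one tracked keyword.
    (Curator itself also filters by hashtags and individual users,
    which this sketch ignores.)"""
    terms = [k.lower() for k in keywords]
    return [p for p in posts if any(t in p.lower() for t in terms)]
```

The case-insensitive substring match is the simplest possible criterion; a real tool would also normalize punctuation and rank the survivors by relevance or recency.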

3. Clip Slides on SlideShare (Clipping)


SlideShare recently launched its clipping tool, which lets you clip and save the best slides from presentations so you can review or share them later. It is a great way to keep the content you have curated organized by topic, so that later you can easily share only the best insights.

To get started, register and sign in to SlideShare, then click My Clipboards in the navigation bar.
On the next page, click Create a Clipboard. Enter the name and description of the clipboard you want to create, and choose whether it should be public. The name of the clipboard should be consistent with (logically related to) the topic of the slide collection you are building.

RankBrain Google

How will the RankBrain algorithm affect SEO in 2016?

Google recently announced the introduction of an algorithm known as RankBrain, which will help determine which search results are displayed.
In fact, over the past few months a “significant percentage” of all queries in Google’s search engine have already been processed by RankBrain.

What actually is RankBrain, what is changing, and where exactly is the problem for SEO specialists?

RankBrain is a kind of artificial intelligence currently applied alongside Google’s existing search algorithm to provide better matching of results to user queries. RankBrain applies machine learning, using mathematical processes and models to understand the semantics of the user’s language; it will gradually learn more about how and why people search, and apply those models to improve the matches in future search results. RankBrain should be thought of as a self-teaching artificial intelligence that gradually refines its method of determining the best-matching results for a user’s Google search.
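As a toy illustration only (RankBrain’s actual model is not public, and it uses learned vector representations far beyond word counts), semantic-style matching can be pictured as ranking candidate pages by their similarity to the query:

```python
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_pages(query: str, pages: list) -> list:
    """Order candidate pages by their similarity to the query."""
    return sorted(pages, key=lambda p: cosine(query, p), reverse=True)
```

Exact word overlap is the weakest possible notion of “meaning”; the point of RankBrain, as described, is precisely that its learned representations can match queries and pages that share no words at all.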

What exactly does RankBrain do?

As far as is known, it is a machine-learning algorithm launched at the beginning of this year. Its effects will be observed gradually, and they will influence search and the display of results in Google, albeit in a subtle way.

If you have not yet watched the Bloomberg video, do so now.

Below are the main hypotheses about how RankBrain is expected to affect SEO processes in the long term.

 

Hypothesis 1: User behavior will gradually shift

As search results become ever more precise and better from the user’s point of view, attention will effectively narrow to the top three results, regardless of the type of listing; it will still be organic search, but those results will draw more of the users’ clicks.

So the question will no longer be how to get onto the first page of results, but how to be among the top 3 in the list.

 

Hypothesis 2: Competition will keep getting stronger

What RankBrain really does is prioritize meaningful results. The rankings “game” will no longer be so friendly to those who aim only for more traffic to their sites. Articles that are less comprehensive than the leading “10x” picks will start sliding down the rankings. Only the best will rank; all the other mediocre results will begin to fall behind.

Search will be a zero-sum game. It always has been, but it will become even harder.

 

Hypothesis 3: RankBrain’s machine learning will crush spam and other “black hat” practices

We can conclude that RankBrain is built on results accumulated over many years and is partly based on an algorithm Google has used until now. Given the direction in which Google has been moving in its fight against spam (whose effect was felt quite strongly in 2010), it is a surprise that RankBrain is not an algorithm aimed at blocking black-hat tactics.

One can suppose that if RankBrain works better than Google expected, they will use the positive result to deal with spam and black hats.

 

Hypothesis 4: You can influence RankBrain

Google “feeds” RankBrain with offline data, which means RankBrain does not learn from the internet as it is. Whatever Google deems good enough is given to RankBrain. So spreading terms such as ‘Growth Hacking’, ‘Inbound Marketing’, or ‘Link Earning’ may actually signal that you are competent in the field those concepts cover.

If that data is fed to RankBrain and it recognizes you as a source of the term and an authority on the topic, it could produce a positive signal for your site and everything connected with it. This will not be easy to do, but it is definitely something that could influence the algorithm.

 

RankBrain and SEO in 2016

Websites that are laser-focused on closely related topics will enjoy better rankings and better placement in the results, but they will have to raise the quality of their content (their texts) to the 10x level.

Maintaining a strong focus and making technical improvements to the site, such as reducing loading time and using microdata and SSL, will be more important than ever. Links to other topics or niches will become a lower priority for blogs and news sites.

It pays to do the things that never die in SEO. Cyrus Shepard of Moz put it very precisely:

  • Write more fully and comprehensively
  • Answer more questions
  • Answer the question behind the question, i.e. questions people have not yet asked
  • Publish enough to become an authority on the topic
  • Create evergreen resources amid the short-lived page views

All of these are hard things to do, but by following them you will probably manage to stay near the top for a long time. Not much is really known about RankBrain yet, but some practical strategies can be built from what is known so far. They are not exactly new, previously unused strategies; rather, they rest on the foundation of what Google has always been.

Source: www.forbes.com