Internet of Things to replace Mobile Phones as most connected device in 2018

  • The Internet of Things (IoT) is expected to overtake mobile phones as the largest category of connected devices in 2018, growing at a compound annual growth rate (CAGR) of 23%, with a total of 28 billion connected devices forecast by 2021. Massive IoT connections will leverage the ubiquity, security and management of cellular networks.

connected device worldwide

  • There were 400 million IoT devices with cellular subscriptions at the end of 2015. Cellular IoT is expected to show the highest growth among the different categories of connected devices, reaching 1.5 billion in 2021.

IoT connected device

  • 1.2 billion new subscriptions worldwide during Q1 2016.
  • 9 billion mobile subscriptions by 2021, including 6.3 billion smartphone subscriptions and 7.7 billion mobile broadband subscriptions

mobile subscriptions

  • 5G networks, driven by new use cases, are expected to be deployed commercially in 2020. 1.7 billion new subscribers are expected in the Asia/Pacific (APAC) region. Smartphone subscription rates will increase by more than 200% between 2015 and 2021 in the Middle East and Africa.
  • 4.3 billion LTE subscriptions by the end of 2021

subscriptions by technology

  • Smartphone subscriptions set to almost double by 2021

smart phone subscriptions

 

These essentials and other disruptive facts were published in the 2016 Ericsson Mobility Report (PDF).

Ericsson created subscription and traffic forecast baselines and several interactive graphs. Using their planning models, Ericsson validated trends and estimated future development figures based on macroeconomic trends, user trends (researched by Ericsson ConsumerLab), market maturity, technology development expectations and industry analyst reports, on a national or regional level.

Read the four summary articles provided by Ericsson and the full Ericsson Mobility Report 2016.

Additional details on the Ericsson methodology can be found on page 30 of the report.

 

scroll tracking

Scroll Tracking in Google Analytics: Why and How?

Do people actually read your site's content?

The intuitive answer is, “Well, of course! Obviously they read it! So many hits on searched words and phrases, and so many page views…!”

In principle that is true… but how do you find out whether people really read the content? Did they read the whole article, only the first paragraph, or only isolated parts of it?

Wouldn't it be great if there were a plugin or another tool that could answer these questions, so you could spend your writing time and budget better instead of producing texts that are not read as much as you expect?

How can we better understand how engaging the content of our articles is for the reader, and whether they really read it all the way through? The answer is:

Scroll Tracking

scroll tracking

This is a plugin that defines threshold values based on the height of the page, then creates and tracks events that fire when a visitor scrolls to a given position on the page (i.e. has scrolled past 10%, 25%, 50%, 75%, 90% and 100% of its height). The Scroll Tracking results for all of these events can be seen in the Google Analytics reports, for example:

Event Category: Scroll Tracking
Event Action: <% scrolled> (10%, 25%, 50%, etc.)
Event Label: (/blog/super-amazing-content/)
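
As a minimal illustration of how such an event could be sent to Google Analytics, here is a sketch using the standard analytics.js ga() command; the field values mirror the example above, and the helper name reportScrollDepth is an illustrative placeholder, not part of the actual plugin.

    // Minimal sketch: report a scroll-depth milestone to Google Analytics.
    // Assumes the standard analytics.js tracker (the global ga() function) is already loaded.
    function reportScrollDepth(percent) {
      ga('send', 'event', {
        eventCategory: 'Scroll Tracking',         // Event Category
        eventAction: percent + '%',               // Event Action, e.g. "25%"
        eventLabel: window.location.pathname,     // Event Label, e.g. "/blog/super-amazing-content/"
        nonInteraction: true                      // do not let the milestone affect bounce rate
      });
    }

    // Example: fire the 25% milestone.
    reportScrollDepth(25);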

 

 

How do you install the Scroll Tracking script?

With Google Tag Manager. Step by step:

1. Download the container file (a JSON file).

2. Import the JSON file into GTM. Log in to your own Google Tag Manager account and open the Admin section. From Container options, select Import Container.

3. Update it with your own Tracking ID. Update or create a Constant Variable named {{YOUR_GA_TRACKING_ID}} containing your Google Analytics Tracking ID (a.k.a. UA number).

4. Preview & Publish. Use the preview options to test the container on your own site. Try triggering each of the events to make sure they work correctly. If everything looks good, go ahead and publish!

Without Google Tag Manager
Download the scroll tracking script (a .js file) to your own server. Add the external .js file to the <head> section of every page on which you want to track how the content is read.
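
This is not the actual plugin code, but a rough sketch of what such a standalone script could look like, assuming the reportScrollDepth() helper from the earlier sketch is available on the page:

    // Minimal sketch of a standalone scroll-depth tracker (not the plugin's real code).
    var thresholds = [10, 25, 50, 75, 90, 100];   // percentage milestones to report
    var fired = {};                               // milestones already sent this page view

    window.addEventListener('scroll', function () {
      var doc = document.documentElement;
      var scrollable = doc.scrollHeight - window.innerHeight;   // total scrollable distance
      if (scrollable <= 0) { return; }                          // page shorter than the viewport
      var percent = Math.round((window.pageYOffset / scrollable) * 100);

      thresholds.forEach(function (t) {
        if (percent >= t && !fired[t]) {
          fired[t] = true;
          reportScrollDepth(t);                                 // see the earlier sketch
        }
      });
    });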

 

Where Scroll Tracking can be used

Scroll Tracking can be installed on selected pages of a site only. For example, if you already have pages that you track for engagement (a form to fill in and submit, or clickable buttons), adding this plugin there would be pointless. Installing scroll tracking on short pages (articles) will also not add much; for such pages with little content, engagement timer tracking is the better option.

Using Scroll Tracking together with other tracking mechanisms is more effective

Used on its own, this tool will not magically give you an exact picture of how engaged users are with the page content. Combining it with other user-engagement tracking tools will give more accurate and more impressive results.

 

Customize your script!

The Scroll Tracking script can be customized! Simply change the code inside the HTML tag or in your JavaScript file to get the results you expect.

scroll-customize

 

Frequency

You can choose and adjust how often results are generated, based on the events the user triggers. You can generate events at exact percentages, such as 10% or 90%, or at regular intervals, for example every 25%.
In addition, you can choose to fire based on pixels, triggering when users pass a certain threshold or at set pixel intervals. For example, if you have a page with infinite scrolling, you can track pixel depth to find out exactly how far a user has scrolled!
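
As a rough illustration of the pixel-based mode, the listener could report fixed pixel intervals instead of percentages, which suits infinite-scroll pages; the 1000-pixel step below is an arbitrary example value, not a plugin default:

    // Sketch: report scroll depth every 1000 pixels (useful for infinite-scroll pages).
    var pixelStep = 1000;           // example interval; adjust to taste
    var deepestReported = 0;        // deepest pixel milestone already sent

    window.addEventListener('scroll', function () {
      var depth = window.pageYOffset + window.innerHeight;          // lowest visible pixel
      var milestone = Math.floor(depth / pixelStep) * pixelStep;    // round down to the nearest step
      if (milestone > deepestReported) {
        deepestReported = milestone;
        ga('send', 'event', 'Scroll Tracking', milestone + 'px', window.location.pathname);
      }
    });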

 

Top and Bottom

You can also define the top and bottom of the area you want to track on scroll using CSS selectors. This is particularly useful if you have longer headers or footers, or expanding comment sections that are expected to keep growing. By filling in Top and Bottom, you can track the user's scrolling within that specific section.
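
A small sketch of the idea behind bounding the tracked area with a CSS selector; '.post-content' is a hypothetical selector standing in for whatever marks your article body:

    // Sketch: measure scroll progress within one section only, selected by a CSS selector.
    function sectionScrollPercent(selector) {
      var el = document.querySelector(selector);
      if (!el) { return 0; }
      var rect = el.getBoundingClientRect();
      var top = rect.top + window.pageYOffset;                      // section start in page coordinates
      var visibleBottom = window.pageYOffset + window.innerHeight;  // lowest visible pixel
      var progress = (visibleBottom - top) / rect.height;           // fraction of the section passed
      return Math.max(0, Math.min(100, Math.round(progress * 100)));
    }

    // Example: log how far into the article body the visitor has scrolled.
    window.addEventListener('scroll', function () {
      console.log(sectionScrollPercent('.post-content') + '% of the article viewed');
    });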

 

Google ranking

Study: outbound links affect the ranking of sites in Google

Reboot Online recently published the results of its research into how outbound links affect the position of websites in Google search results, despite statements to the contrary from the search engine's representatives.

What are outbound links?

Outbound links are links that are meant to take you elsewhere: links that direct you to another specific webpage or to a different website altogether. There are two kinds of outbound links – dofollow links and nofollow links.

What did they do?

The study covered 10 newly purchased domains. All 10 domains were registered at the same time, to eliminate the influence of the domain age factor.

Several articles were published on these new sites. Two highly respected, relevant resources were mentioned in each of these articles. On 5 of the sites the resources were mentioned as plain text only, and on the other 5 as hyperlinks, as follows:

1) aveoningon.co.uk – does not contain outbound links
2) bistuluded.co.uk – contains outbound links
3) chotoilame.co.uk – does not contain outbound links
4) dyeatimide.co.uk – contains outbound links
5) edikatstic.co.uk – does not contain outbound links
6) foppostler.co.uk – contains outbound links
7) gamorcesed.co.uk – does not contain outbound links
8) heabasumel.co.uk – contains outbound links
9) iramebleta.co.uk – does not contain outbound links
10) jundbaramn.co.uk – contains outbound links

All of these sites were designed with a similar, but not identical structure.

Web crawler access to the domains was blocked until the content of the sites had been published. This step was important to ensure that Google would index the content of all the websites at the same time.

 

Once web crawler access was unblocked and the sites were indexed, Reboot Online staff performed several searches for key phrases and recorded the results as screenshots. Ranking progress was monitored for 5 months.

 

Ranking results

It was not a big surprise that the presence of outbound links to reputable sites (in the analysed websites) had a positive impact on site ranking.

Heat map

Heat maps were created from the ranking results for a single day of testing. Green squares indicate that a site held the expected position in the Google SERP for the keyword. Red squares (on a red gradient) denote a site that held a low position: the further away it is from the mid-point, the darker red the square is.

The statistics for the keywords [Phylandocic] and [Ancludixis] have been published:

Keyword [Phylandocic] results

Phylandocic

 

Keyword [Ancludixis] results

Ancludixis

Websites ranking position

Website ranking positions for the same keywords were also presented. These diagrams show the positions of the sites in the Google rankings: the blue line shows the positions of sites containing outbound links, and the orange line shows the positions of sites that do not contain outbound links.

Keyword [Phylandocic] results

Phylandocic

 

Keyword [Ancludixis] results

Ancludixis

 

 

seo effect

Top 10 Free Tools for Optimizing a Website

It’s easy to design and develop a website, but optimizing it for search engines is tough work.

If you Google for free search engine optimization tools, you will find a huge number of testing tools, so in this report we present our top 10 of the most useful, worthwhile and accurate optimization tools for webmasters.

1. Google Trends

Google Trends is a public web facility of Google Search that shows how often a particular search term is entered relative to the total search volume across various regions of the world, and in various languages.

Google Trends

Google Trends uses real-time search data to help you understand consumer search behavior.

Using Google Trends you can compare searches between two or more terms.

Google Trends helps you find out how people search for your brand, as well as information about your competitors and partners.

 

2. Google AdWords Keyword Planner

Keyword Planner is a free AdWords tool that helps you build Search Network campaigns by finding keyword ideas and estimating how they may perform.

Google Keyword planner

Keyword Planner helps you identify keywords and define the scope of your campaign. You can search for keyword ideas using specific terms that describe your campaign (product, service…). You can create multiple lists of keywords to combine. You can also get traffic estimates and historical statistics for the searches.

To use it you need a Google AdWords account. Using AdWords you can reach more customers online and boost your business.

 

3. Robots.txt Tester

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots.

robots.txt
The robots.txt Tester is a tool in Search Console for testing your robots.txt file. The robots.txt file sits at the root of your site and indicates those parts of the website you don’t want accessed by search engine crawlers.

 

4. HTML/XHTML Validator

 This validator checks the markup validity of Web documents in HTML, XHTML, SMIL, MathML, etc.

HTML XHTML W3C validator
W3C validator allows you to check HTML and XHTML documents for well-formed markup. W3C validation is important for site usability and browser compatibility.

 

5. CSS Validator

The W3C CSS Validation Service is a free tool that helps Web designers and Web developers check Cascading Style Sheets (CSS).

W3C CSS validator

The W3C CSS Validator allows you to find errors and issues that need to be fixed in your CSS files. Using it you can check your CSS code and find errors or typos.

 

6. Structured Data Testing Tool

It’s a Google tool that validates and tests structured data in websites.

You can use the Structured Data Testing Tool to check whether Google can correctly parse your structured data markup and display it in search results.

 

 

7. Mobile Friendly Tool

It’s a Google tool that analyzes a web page, reports whether the page has a mobile-friendly design, and gives optimization recommendations.

Google Mobile Friendly Tool

Using the Mobile Friendly Tool you can see how Googlebot sees your page and your website as a whole. You can identify any errors or issues that need to be fixed.

 

8. Page Speed Tool

The PageSpeed tools analyze and optimize your site following web best practices.

Google Page Speed Tool

Using Google's PageSpeed tool you can analyze your site's performance and identify how to make your site faster and more mobile-friendly.

 

9. Search Console

Google Search Console is a free service offered by Google that helps you monitor and maintain your site’s presence in Google Search results.

Google Search Console

Using Google Search Console you can monitor your website's performance in Google search results: you can check whether Google can access your content, submit new content for crawling, monitor malware or spam, and more.

 

10. Google Analytics

Google Analytics lets you measure your advertising ROI as well as track your Flash, video, and social networking sites and applications.

Google analytics tool

You can use Google Analytics to track your website and report on its traffic. You can generate reports and find detailed information and data about your users, content, referrals, social channels, traffic sources and more.
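
For reference, the standard analytics.js bootstrap and basic tracking calls look roughly like this; replace the placeholder 'UA-XXXXX-Y' with your own Tracking ID and place the code on every page you want to track, typically just before the closing </head> tag:

    // Standard analytics.js asynchronous loader, followed by basic tracking calls.
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-XXXXX-Y', 'auto');   // create a tracker for your property (placeholder ID)
    ga('send', 'pageview');               // record the page view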

 

Google Page Rank

Google has confirmed it is removing Toolbar PageRank

It’s official: Google has confirmed it has decided to remove Toolbar PageRank.

Google Page Rank

That means that if you are using a tool or a browser that shows you PageRank data from Google, it will soon stop showing any data at all.

The visible PageRank has already been removed for some Google Toolbar users; for the rest it will take some time as the removal rolls out. You have probably already noticed that your Google PageRank (in the browser) is showing as not available… or it soon will be.

Google explained that it still uses PageRank data internally within the ranking algorithm; only the external PageRank shown in the Toolbar will be completely removed.

Will the PageRank removal change anything for SEO or site owners?

Google explains that there are no changes in how it uses PageRank after its removal from the Toolbar. It will not affect how sites show up in the search results.

We will still be able to get information about a site through Search Console (presence, content, links pointing to it, etc.). There are also lots of other Google tools, toolbars and extensions that continue to help us understand where we rank in the search results.

 

 

 

 

 

Panda Google Update

Do we really understand Google's Panda update?

Google Panda is one of the filters Google uses to push down pages with low-quality content and, in turn, push up pages with valuable, quality content. Often, however, this algorithm remains misunderstood.

“The Panda algorithm, which applies to all sites, is a key signal taken into account when determining page rank. It measures the quality of a website, and more information can be found in these guidelines. Panda allows Google to take quality into account and include it in the overall ranking assessment.”

Google

Many SEO specialists try to reverse engineer the Panda algorithm, studying the “winner” and “loser” sites of each update, but correlation is simply not the same as causation.

There are many myths about the Panda update that people take for facts, and their desire to “fix” the penalties imposed by the algorithm leads to even more problems for their sites. One example of a common wrong practice is removing supposedly low-quality content entirely just to improve ranking.

Add to the mix the update's current slow rollout, and it becomes even harder to identify whether Panda has influenced a site at all, or whether the change comes from one of the many other mechanisms Google adjusts every year.

So here is what is known about Google Panda, confirmed by Google itself, so that webmasters affected by the algorithm can recover in time.

The impact of Panda on content

Rethinking general content and Google Panda

Google has recently advised against removing content to avoid Panda. Some SEO experts still disagree with this advice (some quite loudly), but it makes a lot of sense. Removing content simply because you think it is bad for the site in terms of Panda can create serious problems with visibility in the search engine.

 

Removing the wrong content

The biggest problem occurs when webmasters decide to remove content that Google may actually consider valuable (i.e. it brings traffic and ranks well). If you do not review each part of the content and compare it against Google's data, you may delete content that performs well but simply covers a more specific, less visible topic.

You can use Search Analytics in Google Search Console to see which pages get high traffic and to determine where the content needs to be changed.

Google advises reviewing the searches and, where they do not match the content on your pages, making the relevant changes.

Overall, removing the wrong content can have a serious impact on your site, because you both remove content that Google considered good and throw away the traffic those pages brought. It is very easy to conclude that your site is losing traffic because of the Panda algorithm or something else, when in fact you are the cause, by removing pages.

There is also the possibility that another search engine considers your content great, and by eliminating it you lose that traffic too.

 

Remove or correct content

As already mentioned, removing content is not the best solution, so you should not follow the “remove everything” strategy that many experts most often suggest. Some time ago, the accepted approach was to add new, quality content and to change the old rather than erase it. If someone was determined to remove content, it was simply kept out of the index using robots.txt or NOINDEX.

In fact, this is the only action Google proposes for low-quality content: it simply should not be indexed, rather than being eliminated completely.

Non-indexed content can still be found, but each user has to look for it on the site themselves, so you can also track whether these pages create a natural flow of traffic on the site. Then you can see whether the content has been wrongly marked as weak and should be indexed after all.

There is also the option of temporarily stopping the indexing of specific pages with poor content until it can be improved. Google recommended this back in 2011, before the Panda algorithm was officially released.

As for low-quality pages, it is better simply to avoid creating them in the first place, i.e. pages that do not fall into the category of unique content that is useful to users and makes them trust you in the future.

 

When removal of content is necessary

Despite everything written above, there are still cases in which removing content is the only option: for example, when a forum has a glut of spam posts that cannot be changed for the better.

The same goes for a website that only republishes RSS feeds, when the webmaster does not have time to rework all of those published articles in a new and interesting way.

 

Adding new, quality content

Adding new content to the site is also a great remedy against Panda. A site always gains from new material, even if Panda negatively affects some parts of it.

 

Sites affected by Panda still get ranked

A common misconception about Panda is that if a website is hit by Panda, it no longer ranks. This, of course, is not true.

Most people see sites with terribly poor-quality content and assume the whole site is weak. If a website has quality content on some pages, that website continues to be shown in the results.

Google also confirmed:

“The Panda update may continue to show such websites for specific and precise queries, but their visibility in search will be limited, which benefits neither the site owner nor users.”

There is another reason Google advises against eliminating poor-quality content: eliminating spam is a plus, but if the quality of some pages is low, it is better simply to add new content and new pages.

 

Do all pages need high-quality content for the website to do well with Panda?

Sites with pages affected by Panda can still be found in the search results. What matters is that the majority of the content is high quality. There is no formula or special recipe for it.

Quality content improves ranking.

“In general, high-quality sites provide great content to users on most of their pages.”

Google

A site with a few irrelevant pages can still be seen as a great website, though it is always better to improve the content. This is not something webmasters should worry about too much; it is important, however, that such pages remain few.

User expectations

Do you meet users' expectations? It is important that, when your website ranks for certain keywords, the content actually matches those words. You may have great text, but if your pages show up for queries that do not match what people on Google expect, this is a signal of poor-quality content.

“At the end of the day, what matters is not how many visitors you had at any given time, but how helpful you have been to them.”

Google

Duplicate content

Many people believe that the duplicate-content filter (yes, a filter, not a penalty) is the heart of Panda, but it is not. The algorithm and duplication are two different things.

However, it is good to remove duplicates, especially if there are many of them, because they can interfere with your site.

In terms of SEO, duplicates are a lower priority than providing the highest possible quality content. Google has said that if you have to prioritize your time, duplicate content can be cleaned up last. Of course it is worth optimizing, but this factor is not critical; the algorithms handle the problem without difficulty. For example: if you buy a car it is natural to want it to look good, but if there are serious technical failures, those become your priority.

In other words, internal duplicate content is not of great importance, especially in the world of WordPress, where less experienced users often duplicate pages. It is only a small part of the algorithmic puzzle, although it can still play a noticeable role.

 

Similar content

Especially for “How to…” sites it is important to monitor the content, because similar pages appear quite often. Quite often, when a site has overlapping and similar material, its quality rating drops significantly.

For example, the site eHow has over 100 articles (maybe even more) on “How to flush the toilet.” Unless the site is called uncloggingtoilets.com, it is desirable for some of these pages to be deleted or combined; quite possibly Google sends no traffic at all to those 100 pages.

Check the links to these pages and see which ones actually receive traffic, because that is a sign they have enough quality to rank. But pay attention to those that have none.

 

Error 404 – page not found

One of the most common errors, 404, occurs when there is a crawling problem or when a page has been deleted. This error does not affect Panda in any way.

 

Aggregated content

Google is not OK with websites built on aggregated content, with a few exceptions.

“One thing you really need to consider is that our algorithms focus on unique, compelling and high-quality content. If a site just piles up links to other sites, it adds no value by showing up in the search engine – it would be better for users simply to visit the specific sites directly. So rather than just focusing on technical issues such as crawling and indexing, we recommend you take a step back and consider the model of the site.”

Google

There are exceptions to this rule: Techmeme.com is an example of an aggregator that, in Google's eyes, operates successfully on the web.

 

Look at the site with fresh eyes

For a webmaster it is important to determine whether the content of a website is really high quality. Sometimes it takes someone who is not directly connected to the site to visit it and share an opinion.

Many SEO experts think their website consists of amazing, unique content, whether it was written by them or by someone else, but their judgment can be subjective.

 

Word count and on-page factors

Rethink the word count

On the one hand it is a good idea to provide content above a certain number of words, but on the other hand short content is not necessarily hit by Panda. In fact, there is plenty of content that Google not only considers high quality, but even rewards with its own snippet.

If word count alone were what mattered, the game would become very easy for spammers. So this does not mean writing “content of n words because of Panda”; the simple advice is to optimize those parts of the site as well as possible.

 

Information snippets on Google

The theory states that content of fewer than 250 words is “thin”, but if a site removed such content just because it is short, the website would not rank any better, and it would lose its snippets.

 

Advertising & Partners (Affiliates)

Advertising and affiliate links affect the Panda, but only because divert attention from the main sites content.

The webmasters should not be interested in the number of visitors to a site, but rather how many of them were helpful. This is often at odds with the real situation. Too many sites are trying to get more benefit from the user than the serve the user when he/she went to the site.

Partner sites generally are not the alert and Panda check them harder than ordinary non-affiliate site. Many affiliate sites simply do not create content that Panda approved. The same can be said for pages that simply bother visitors with mass advertising content.

Advertising

As Google's guidelines on advertising say: as long as your ads do not intrude on people, there is no Panda problem. There is also a separate algorithm for page layout, which looks at the positioning of ads on pages and how far the content is pushed below the visible part of the screen.

A key point is the way in which users notice the advertisements.

People often assume that too many ads in the visible part of the page are grounds for punishment by the Panda update, but this issue is handled by another algorithm called Top Heavy. Top Heavy goes after sites that display very prominent ads in the visible area and force the user to scroll down to see the content.

 

User-generated content

This type of content is often a matter of debate. Many SEO experts recommend removing it, because they believe that, according to Google, it lowers quality. But this is far from the truth.

There are plenty of sites whose primary content is generated by users, and they rank great and are well placed in Google's results.

The search engine sees nothing wrong with quality sites built on user-generated content, and you should not believe the motto “You have to delete everything written by visitors.”

 

Forums

Forums deserve special attention here, because removing content from them as a preventive measure against Panda is not the best solution. If that happens, we move toward imposing censorship, and a quite different problem appears. You can have a forum with great content where the user contributions are not at a high level, yet there is still a chance they are useful. If you start removing content generated by visitors, they may abandon you.

 

Titles in forums

It is a good idea to give moderators/admins the freedom to change some titles in the forums. Topics often start with titles such as “HELP” or “What should I do?”, which remain unclear. Rewriting the title is not just for the search engines; it also helps all users. Google wants to send you traffic, but you need to show relevant results to people.

Another excellent recommendation for improving titles (of the “Help” type) is to add terms such as “solution” or similar. Google sometimes rewrites headlines and does not guarantee it will show yours, but your chances increase, because by doing this you send a clear signal that you want to be helpful to people.

You can add it at the beginning or the end of the title – some people do not want to risk changing the position of keywords in the title tag – but do it so that it can be seen in the search results, especially if the title is long. This works and ranks well for many sites. If someone searches for a specific problem and sees in the title tag that there is a solution, that will automatically win you a visitor.

Comments

Comments can be another problematic area in terms of the Panda update. Good comments indicate that a page may be very interesting to the user, but bad comments can ruin the reputation of the entire page.

Even stories included in Google News can be dropped if the algorithm notices they have attracted too many comments; in that case the article starts being copied too widely and the value of the original article drops.

Comments definitely can be affected by Panda, depending on their type and how they are moderated.

The web service Disqus, for example, offers the option of keeping comments out of the page's HTML code. WordPress has an option to paginate comments, which avoids the situation where hundreds of poor-quality comments harm the reputation of the entire page.

Comments as a positive signal

Good comments can also be a great help in terms of the Panda update. Quality comments enhance the prestige of the page, show interaction with users and reflect the actual popularity of the content.

 

Spam comments and comments that escalate into personal conflict

This type of comment is bad for your webpage and for the website as a whole. If you let comments multiply without moderating them, and they fill up with spammers and people who just hurl insults rather than comment on the subject, this will definitely weaken the authority of all your content.

What do you think Panda will do when it sees quality content at the top of the page followed by lots of spam comments? Will the algorithm's final verdict be positive or negative?

If the comments below an article are poor, it may be best to block them. It is very important to follow a strict policy regarding comments. If people start to associate your company with ugly, empty comments, it does not help your business.

 

Comments can affect entire websites

Sometimes spam comments affect entire sites. This usually happens with sites that do not offer particularly good content. In such a situation the bad comments can affect even pages that have no comments at all.

 

Removing all comments

Lately some experts have recommended removing all comments as a measure against the Panda update.

A decision like this forfeits some benefits. First, if the content is good, the good comments, and the engagement responsible for raising the site's prestige, will no longer be visible. Second, the value of the article will not reach its maximum, because many users return to an article again and again just to follow the opinions of others.

The complete absence of comments is also not good for the site: a user searching for information on a given subject will not be able to see the reactions of other people interested in the same topics. Comments often help clarify points from the article.

The presence of comments can also help your website rank in search engines for relevant keywords.

Loading speed

Many people still wonder whether website loading speed is a ranking signal. Page loading speed is officially recognized by Google as a ranking factor. It is not as strong a signal as links and content, but it is still important.

User engagement

There is no evidence of a direct connection between user engagement and Panda, although many experts believe that likes and shares are indicators of quality.

The only possible engagement signal Panda could draw on is user comments, which is one more reason to think carefully about keeping or removing them.

Advantages of TLDs (top-level domains)

Although some TLDs carry more spam than others, Panda does not treat content differently depending on which TLD it sits on, and Google does not give preference to sites just because they are .com or .gov.

 

And again:

Remember that Panda deals with content rather than technical problems. Much of the trouble you are having may look like a technical issue, but no matter how hard you try to correct technical problems, it will not help you with Panda.

Simply create unique, quality content and Panda penalties will stay away from you.

 

Content curation and publishing

3 little-known tools for curating and publishing content

 

New online tools that are appearing and are not yet widely known are changing the way merchants and marketing specialists collect, curate and deliver specific content to their audiences through social channels.

Here are three of them.

1. Medium

medium

Medium is a community of readers and writers who offer unique perspectives on creating and sharing content. An initial registration is required before you can start using the platform.

Medium is a great platform for marketing, since it allows content to spread quickly. You can write long articles, short posts or tweets, and upload content to share with the Medium community. When you publish articles, they are shared by your followers with other members of the network, who search for and find content based on tags and shared opinions.

Medium also offers a content curation feature that many people overlook. You can create your own publication and curate articles or authors from across the network.

Creating a publication
To create a publication, go to the Publications page and click New Publication. Then enter the details of your publication and lay it out. You can choose between several layout styles (Grid, Stream, or List) and decide how many posts appear on the homepage.

To curate content, search Medium for your key terms to find articles close in topic and message to the one you have crafted in your own article. Once you find a suitable article you would like to publish, click the ellipsis icon at the end of the article and choose Request Story from the drop-down menu. This connects you with the article's author by email, and they will let you know whether or not you may publish their story. If you get permission, the story can be added to your publication.

2. Twitter’s Curator platform

twitter curator platform

Some of the most popular Twitter accounts do not necessarily share their own content or their own posts. They look for the best articles, tweets, videos, photos and stories related to a specific topic and share them with their followers.

Twitter lets you curate content by sharing links or tweets already published by others. The first approach is especially popular: you search for online content and share it.

The second approach is to use Twitter's Curator platform by entering specific keywords, hashtags or individual users. Based on your search criteria, the platform returns the results closest to what you are looking for, in real time. You can then share any of these results with your own followers if you choose.

3. Clip Slides on SlideShare (Clipping)

Clipping on SlideShare

SlideShare recently launched its clipping tool, which lets you clip and save the best slides from presentations so you can review or share them later. It is a great way to keep the content you have curated organized by topic, which makes it easier to share later and lets you deliver only the best insights.

To get started, register and log in to SlideShare, then click My Clipboards in the navigation bar.
On the next page click Create a Clipboard. Enter the name and description of the clipboard you want to create and choose whether it should be public or not. The name of the clipboard you create should be in keeping with (logically related to) the theme of the slide collection you are building.

RankBrain Google

How will the RankBrain algorithm affect SEO in 2016?

RankBrain Google
Google recently announced the introduction of an algorithm known as RankBrain, which helps determine how search results are displayed.
In fact, over the past several months a “significant percentage” of all queries on Google's search engine have already been processed by RankBrain itself.

What actually is RankBrain, what is changing, and where exactly is the problem for SEO specialists?

RankBrain is a form of artificial intelligence currently applied alongside Google's existing search algorithm to provide better matching of results to user queries. RankBrain applies machine learning, using mathematical processes and models to understand the semantics of the user's language; it will gradually learn more about how and why people search and apply these models to improve the matching of future search results. RankBrain should be thought of as a self-learning artificial-intelligence system that gradually improves its method of determining the best-matching results for a user's Google search.

What exactly does RankBrain do?

As far as is known, it is a machine-learning algorithm launched at the beginning of this year. Its effects will be observed gradually, and they will affect search and the display of results in Google, albeit in a subtle way.

If you have not yet watched the Bloomberg video, do so now.

Below are the main hypotheses about how RankBrain is expected to affect SEO processes in the long term.

 

Hypothesis 1: User behavior will gradually shift

As search results become ever more accurate and better from the user's point of view, attention will concentrate on the first three results – regardless of the type of listing – and, since this is still organic search, it will lead to more user clicks on the results shown.

So the question will no longer be how to get onto the first page of results, but how to be in the top 3 of the list.

 

Hypothesis 2: Competition will keep getting stronger

What RankBrain really does is prioritize meaningful results. The results “game” will no longer be so friendly to those who aim only for more traffic to their sites. Articles that are less comprehensive than the selected 10x leaders will start to slide down the rankings. Only the best will rank; all other mediocre results will begin to fall back.

Search will be a zero-sum game. It always has been, but it will get even tougher.

 

Hypothesis 3: RankBrain's machine learning will crush spam and other “black hat” practices

We can assume this is built on results accumulated over many years and based partly on an algorithm Google has already been using. Given the direction Google has taken in the fight against spam (whose effect was felt quite strongly in 2010), it is surprising that RankBrain is not an algorithm aimed at blocking “black hat” tactics.

It can be presumed that, if RankBrain works better than Google expected, they will use the positive result to deal with spam and black-hat techniques.

 

Hypothesis 4: You can influence RankBrain

Google “feeds” RankBrain with offline data, which means RankBrain does not learn from the live internet as it is. Whatever Google deems good enough is given to RankBrain. So spreading terms such as ‘Growth Hacking’, ‘Inbound Marketing’ or ‘Link Earning’, for example, may actually be a signal that you are competent in the area those concepts cover.

If this is fed to RankBrain and it recognizes you as the source of the term and an authority on the topic, it could produce a positive signal for your site and everything connected with it. This will not be at all easy to do, but it is definitely something that could influence the algorithm.

 

RankBrain and SEO in 2016

Websites that are laser-focused on their topics will enjoy better ranking and better placement in the results, but they will have to raise the quality of their content (their copy) to the 10x level.

Maintaining a strong focus and introducing technical improvements to the site – such as reducing load time, using microdata, and SSL – will be more important than ever. Links out to other topics or niches will become a lower priority for blogs and news sites.

It is also good to keep doing the things that never die in SEO. Cyrus Shepard of Moz put it very precisely:

  • Write more fully and comprehensively
  • Answer more questions
  • Answer the next question beyond the main point – i.e. questions people have not even asked yet
  • Publish enough to become an authority on the topic
  • Create evergreen resources amid the short-lived pageviews

All of these are hard things to do, but by following them you will probably manage to stay near the top for a long time. Admittedly, not much is yet known about RankBrain, but some practical strategies can be built from what is known so far. They are not exactly new, never-before-used strategies – but they are the ones at the heart of what Google has always been.

Source: www.forbes.com

Most Powerful SEO Analytics Tools

Google Analytics

Google Analytics is a freemium web analytics service offered by Google that tracks and reports website traffic. Integrated with Google AdWords, users can review online campaigns by tracking landing page quality and conversions (goals). Goals might include sales, lead generation, viewing a specific page, or downloading a particular file.

Google Analytics‘ approach is to show high-level, dashboard-type data for the casual user, and more in-depth data further into the report set. Google Analytics analysis can identify poorly performing pages with techniques such as funnel visualization, where visitors came from, how long they stayed and their geographical position. It also provides more advanced features, including custom visitor segmentation.

Google Analytics e-commerce reporting can track sales activity and performance. The e-commerce reports show a site’s transactions, revenue, and many other commerce-related metrics.
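
As a rough sketch of how a transaction reaches those e-commerce reports, the classic analytics.js ecommerce plugin can be used along these lines; the transaction values below are made-up examples, and sites on Enhanced Ecommerce use the separate 'ec' plugin instead:

    // Sketch: record a transaction with the classic analytics.js ecommerce plugin.
    // Assumes a tracker has already been created with ga('create', ...); values are examples.
    ga('require', 'ecommerce');                    // load the ecommerce plugin

    ga('ecommerce:addTransaction', {
      id: '1234',                                  // transaction ID (required)
      affiliation: 'Example Store',
      revenue: '35.00',                            // total, including tax and shipping
      shipping: '5.00',
      tax: '1.50'
    });

    ga('ecommerce:addItem', {
      id: '1234',                                  // same transaction ID as above
      name: 'T-Shirt',
      sku: 'TS-001',
      category: 'Apparel',
      price: '28.50',
      quantity: '1'
    });

    ga('ecommerce:send');                          // send the transaction data to Google Analytics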

 

Google local business

This is about the social aspect of search. It is the place where you can connect with customers and industry representatives by adding them to your circles; they can likewise add you to their circles. Those who have your company in their circles will be able to see updates from your company. You control this information and can update your feed as you wish.

 

Google Places for Business

Google Places is the tool the search engine uses when listing your business. A Google Places page lets you control what information Google has about your business and presents to searchers. Google Places allows you to fill in your Places page with a description, images, hours of operation, and contact information. That information can be changed at any time you wish.

 

Google AdWords

Google AdWords is an online advertising service that places advertising copy above, below, or beside the list of search results Google displays for a particular search query, or displays it on their partner websites. The choice and placement of the ads is based in part on a proprietary determination of the relevance of the search query to the advertising copy. The AdWords program includes local, national, and international distribution. Google AdWords Express is a feature aimed at small businesses that reduces the difficulty of managing an ad campaign by automatically managing keywords and ad placement. AdWords Express was previously known as Google Boost. AdWords Express also supports small businesses that don’t have a website, allowing them to direct customers to their Places page.

 

 

SEO Digger

This is a free online tool that helps you optimize your site for Google. With SEO Digger, you can find your competitors' best keywords and see the top 20 Google rankings for your site.

 

AdCenter Keyword Mutation Tool

The tool is suitable if you want to do both SEO and PPC optimization. Using it you can find keywords with unusual spellings and typos, so you can optimize your website for them.

 

SEO Book’s Firefox Plugin

A free Firefox extension that expands your browser to help you become an SEO research guru.

 

Yammer Analytics

Yammer is a private social network that helps employees collaborate across departments, locations, and business apps. You can monitor your Yammer network using keyword monitoring, security settings, data export, data retention, and analytics.

SME Instrument – Horizon 2020 – Stimulating the innovation potential of SMEs

The SME Instrument stimulates the innovation potential of SMEs.
It has three separate phases, plus a coaching and mentoring service for beneficiary SMEs.

Participants can apply to Phase 1 with a view to applying to Phase 2 at a later date, or directly to Phase 2. Those who successfully pass Phase 2 can apply for Phase 3.

Find below how SMEs can apply, what they need to know, examples, and more.

 

Author: S. Petkova