How to collect and analyze SEO data in the era of big data

In this era when everyone is shouting "big data", data seems to receive unprecedented attention. Individual webmasters, small and medium-sized companies, and large multinational groups alike, whether doing online or offline marketing, are aware of the importance of data; everything is supposed to let the data speak. However, in the author's experience, many small and medium-sized companies and individual webmasters value data more than enough but make insufficient use of it.

Many people are not clear about what data they need to collect; some do not know through which channels to collect it; and most are unclear about how to organize and analyze the data they have collected, let alone how to put it to use. As a result, a lot of data remains just numbers that cannot be converted into value for the company, ending up as gorgeous decoration or chicken ribs. Let me first describe three types of "decorative" data:

1. Valuing data but not knowing how to collect it: the "fuzzy data" type. These people have only a vague understanding of data. Living in an age of information explosion, with the importance of data preached everywhere, they naturally take data seriously and know that company plans and decisions should be supported by data. But because there are no professional data staff, they have only a half-understanding of which data their company (or they as individual webmasters) should collect and through which channels to collect and organize it. In the end the data may be pieced together from brainstorming, so-called online tutorials, and conversations with peers, so it really is little more than an arbitrary arrangement of numbers.

2. Knowing which data is needed but collecting it from non-standard sources: the "wrong data" type. These people understand data somewhat better; having worked on the Internet or in their company for years, they roughly know what data they need and why. But again, without professional data personnel, the sources and production of the data are not standardized and collection errors creep in. The data may therefore be distorted, and its value is naturally limited. In fact, this type is even more of a pose than the first.

3. Able to produce data but unable to interpret and analyze it: the "cheap data" type. These people have a clear understanding of data, accurate sources, and well-defined needs, yet they return empty-handed from the treasure mountain, sitting on a gold mine they cannot exploit. Is that not selling the data cheap? They collect and organize the data into visual reports, but the data alone explains nothing. What is the meaning behind the data? How do you interpret it to create value for the company and the individual? How do you use it to avoid risks and analyze problems? That is where the real value of data lies.

I have digressed a little; what the author mainly wants to discuss today is the collection and analysis of SEO data for network marketing websites. SEM and other media marketing already have fairly mature models for organizing and analyzing data, so I will not repeat them here. What follows is simply a common data model.

1. What data to collect

SEO data should cover three aspects:

① External statistical query data for your own and competitors' websites. This part of the data can be queried comprehensively with external webmaster tools. It mainly includes, but is not limited to: website URL, snapshot date, domain name age, website response time, sites on the same IP, PR value, Baidu weight, pages indexed by each search engine, backlinks reported by each search engine, pages indexed by Baidu in the last 24 hours, keywords ranking in Baidu, estimated Baidu traffic, number of external links, title, meta tags, and server information. Besides the home page, these data can also be used to query inner pages where appropriate.
These data can be entered into an Excel table for regular queries, adding or removing columns according to actual needs. The query cycle can be daily, weekly, monthly, and so on, depending on demand and circumstances.

② Website traffic statistics. Most companies and webmasters now use traffic statistics tools on their sites, which greatly simplifies the work of collecting and organizing data for SEO staff. The more professional statistics tools at present are CNZZ, 51la, and Baidu Tongji (Baidu Statistics). In terms of overall professionalism CNZZ is better; for accuracy and sensitivity to Baidu traffic, I think Baidu Tongji is quite good. To get to the point, traffic data mainly includes, but is not limited to: IP, PV, unique visitors (UV), pageviews per visitor, average visit duration, bounce rate, visited pages and domains, traffic sources, search engine share, search keywords, visitor details, and time-of-day analysis. It is likewise recommended to record these in an Excel table for regular review, adding or removing columns as needed; the query period can be daily, weekly, or monthly, according to actual needs and circumstances.
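The article recommends keeping these numbers in an Excel table; below is a minimal sketch of what such a regular log could look like as a script, assuming you read the values off a webmaster tool and a traffic statistics panel by hand. The file name and column names are illustrative only (a CSV is used for simplicity, and Excel can open it directly); nothing here is prescribed by any particular tool.

```python
# Minimal sketch: append one day's SEO metrics to a running CSV log.
# Column names and file name are illustrative assumptions, not a standard.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("seo_daily_log.csv")
COLUMNS = [
    "date", "baidu_indexed", "baidu_24h_indexed", "backlinks",
    "response_ms", "ip", "pv", "uv", "bounce_rate",
]

def append_daily_row(metrics: dict) -> None:
    """Append one day's metrics; write the header if the file is new."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if is_new:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), **metrics})

if __name__ == "__main__":
    # Example numbers only -- replace with the values you actually queried.
    append_daily_row({
        "baidu_indexed": 1250, "baidu_24h_indexed": 18, "backlinks": 430,
        "response_ms": 210, "ip": 860, "pv": 2300, "uv": 790,
        "bounce_rate": 0.52,
    })
```

Run once per query cycle (daily, weekly, or monthly, as the article suggests) and the log accumulates into exactly the kind of table that can later be charted or compared period over period.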

③ Monitored keyword data

Keyword monitoring is relatively simple and there is not much to say, but it is recommended to categorize the keywords before summarizing and monitoring them. Categories include, but are not limited to: main keywords, main long-tail words, important traffic words, and brand words. As before, it is recommended to record them in an Excel table for regular queries, adding or removing data as needed; the query cycle can be daily, weekly, or monthly, according to actual demand and circumstances.
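As a rough illustration of the categorized monitoring table described above, here is a minimal sketch. The keywords, categories, ranks, and file name are all made-up examples; the ranks would be filled in by hand or exported from whatever rank tracker you happen to use.

```python
# Minimal sketch: a categorized keyword-monitoring sheet.
# Keywords, categories, ranks and the file name are hypothetical examples.
import csv
from datetime import date

KEYWORDS = [
    # (keyword, category)
    ("loan platform", "main keyword"),
    ("how to apply for a small business loan", "long-tail word"),
    ("personal loan interest rate", "traffic word"),
    ("ExampleBrand loans", "brand word"),
]

def write_rank_sheet(ranks: dict, path: str = "keyword_ranks.csv") -> None:
    """Append today's rank for each monitored keyword, grouped by category."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for keyword, category in KEYWORDS:
            writer.writerow([date.today().isoformat(), category, keyword,
                             ranks.get(keyword, "not in top 100")])

if __name__ == "__main__":
    write_rank_sheet({"loan platform": 7, "ExampleBrand loans": 1})
```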

2. Through what channels to collect the data

The Internet era is also an era of replacing manual labor with tools; doing things with tools is fast and convenient, so why not use them?

① External statistical query data for your own and competitors' websites. Since these are external queries, any general webmaster tool will do. I prefer the two online query sites Aizhan and Chinaz (Webmaster's Home); Chinaz in particular handles this data more professionally.

② Website traffic statistics. Traffic statistics tools are already rich in functionality, and mainstream tools such as CNZZ and 51la support data download.

③ Monitored keyword data. If you are an individual webmaster with a relatively small number of keywords, it is more accurate to verify them manually in the search engine and in the traffic statistics backend. For bulk keyword queries it is best to use a tool, but current keyword-ranking software generally produces errors in bulk queries; if the company has the ability, it can develop its own program for this purpose.

3. How to analyze the collected data

Collecting and organizing the data is only half the battle. The data painstakingly gathered through various channels is most valuable when someone actually looks at it, draws some insight from it for their own site, and turns that insight into real benefit.

① External statistical query data for your own and competitors' websites. Analyzing these data is the most common and most basic skill an SEO uses to assess their own site and their competitors'. Through these data (which, after a period of observation, can be plotted as trend charts) you can gain a clearer understanding of how your own and your competitors' sites are optimized and how their weight performs in the search engines. The author briefly describes how to interpret these data.

Baidu snapshot: a fresh snapshot at least shows that the site's content is updated daily and that the Baidu spider crawls it frequently; in other words, the snapshot reflects Baidu's recognition of the site.

Domain name age: the industry generally agrees that, all else being equal, an older domain tends to carry relatively more weight in the search engines.

Response time: this reflects how well the website's server performs. The larger the response time, the worse the server performance, which is extremely negative for both the user experience and the search engines.

Sites on the same IP: you can see how many sites share the IP and roughly tell whether the owner chose shared hosting or bought a dedicated IP. If it is a dedicated IP, you can also see which other sites the owner runs and check those too: know your enemy.

PR value: this is Google's official, publicly visible expression of the recognition and weight it gives a website. Although PR is increasingly diluted, it still has reference value as one measure of a site's strengths and weaknesses.

Baidu weight: this is a value that third-party webmaster tools estimate with their own scoring systems to describe a site's performance in Baidu; it has never been officially recognized by Baidu. Still, as a reference for measuring a site's performance in Baidu, it is useful to webmasters.

Backlinks: the search engine backlink counts reported by webmaster tools are mostly inaccurate, especially for Baidu, where the query command yields very poor results; the Baidu "backlink" value is really just the number of results from a related-domain search on the domain name. Even so, it is still a useful reference for understanding your own external links and studying your competitors' link-building techniques.

Inclusion: the total number of pages indexed by each search engine reflects the site's performance in that engine. If you know the total number of pages on the site, you can judge more clearly how thoroughly each engine has indexed it, and from that analyze whether the site has problems and what they are.

Daily / 24-hour inclusion: reflects how much the search engine spiders favor the site and how well its internal links are optimized.

Ranking words: by checking the ranking keywords of your own and competitors' sites, you can look for gaps in optimization and then examine how the pages ranking for those keywords are optimized.

Meta tags: check how a site's page title, description, and keywords are written, especially your competitors'. Analyzing why they are written that way teaches you a great deal.
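The trend chart mentioned above is easy to produce once the daily log exists. Below is a minimal sketch that plots two of the metrics from the illustrative CSV used in the earlier sketches; the file and column names are the same hypothetical ones, and which metrics you chart is entirely your choice.

```python
# Minimal sketch: plot a trend chart from the illustrative daily log CSV.
import csv
from datetime import date

import matplotlib.pyplot as plt

dates, indexed, backlinks = [], [], []
with open("seo_daily_log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        dates.append(date.fromisoformat(row["date"]))
        indexed.append(int(row["baidu_indexed"]))
        backlinks.append(int(row["backlinks"]))

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(dates, indexed, marker="o", label="Baidu indexed pages")
ax.plot(dates, backlinks, marker="s", label="Backlinks")
ax.set_xlabel("Date")
ax.set_ylabel("Count")
ax.set_title("External SEO metrics over time")
ax.legend()
fig.autofmt_xdate()
plt.tight_layout()
plt.savefig("seo_trend.png")
```

Running the same chart for a competitor's log side by side makes the "gap between the optimization of the sites" the article talks about much easier to see.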

② Website traffic statistics. Accurate traffic statistics for your own site give the webmaster a much deeper understanding of it: they show the site's current state of optimization and provide a good reference for its future operation. Traffic analysis is rarely about a single value; it usually combines several values for judgment, which makes this the most complex part.

IP: analysis is usually done by date comparison, for example this Wednesday against last Wednesday, or the first half of this month against the first half of last month. By watching how traffic changes you can see how the site has changed recently. Of course there are other factors to consider, such as weather, holidays, keyword rankings, server downtime, and news events.

PV: usually compared with the bounce rate and IP to judge the site's user experience and user stickiness.

UV: the number of unique visitors, which reflects roughly how many computers, and hence approximately how many real people, are visiting the site.

Pageviews per visitor, average visit duration, bounce rate: ratios derived from IP and PV that reflect whether the site's user experience is good or bad.

Visited domains and pages: show which pages of the site are more popular, as well as how their weight performs in the search engines.

Sources: which channels visitors come through. From this you can determine the site's audience, analyze the audience's attributes further, and be clearer about the site's target population and how to carry out the site's operational strategy.

Keywords: which keywords users searched to reach the site. This is a good way to check the site's keyword layout and find keywords worth optimizing.

Visitor attributes: by analyzing visitors' geography, education, browser, network access provider, operating system, terminal type, and so on, you can understand the site's users in more detail, which informs future optimization and operation.

Heat map: the heat map shows webmasters how users click on a page's content, reflecting the page's user experience and providing a reference for improving its content.
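The date comparisons described for IP above can also be scripted rather than eyeballed. Here is a minimal sketch that compares the latest seven days with the seven days before them, again assuming the illustrative daily log CSV and its hypothetical column names; it is not a built-in report of any statistics tool, just one way of doing the comparison.

```python
# Minimal sketch: week-over-week comparison of traffic metrics from the
# illustrative daily log CSV (columns "date", "ip", "pv", "uv", "bounce_rate").
import pandas as pd

df = pd.read_csv("seo_daily_log.csv", parse_dates=["date"]).sort_values("date")

this_week = df.tail(7)        # latest 7 logged days
last_week = df.iloc[-14:-7]   # the 7 days before those

for col in ["ip", "pv", "uv", "bounce_rate"]:
    now, before = this_week[col].mean(), last_week[col].mean()
    change = (now - before) / before * 100 if before else float("nan")
    print(f"{col:>11}: {before:.2f} -> {now:.2f} ({change:+.1f}%)")

# PV per UV as a rough stickiness indicator, as discussed above.
print("PV per UV this week:", round(this_week["pv"].sum() / this_week["uv"].sum(), 2))
```

Remember the caveat in the text: a swing in these numbers may come from weather, holidays, ranking changes, downtime, or news events, so the script only surfaces the change; interpreting it is still the analyst's job.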

There are a number of other values that I will not introduce here.

③ Monitored keyword data. Relatively speaking, this analysis is simple: categorize the keywords, query their rankings in the search engines, and then compare and analyze the conversions the keywords bring. From this you can see how the optimization is going: which keywords still need strengthening, which need to be maintained, and which rank highly but bring nothing of substance, and then adjust the site's optimization strategy accordingly. At the same time, the traffic and conversions brought by keywords can be compared with the conversion contribution of other traffic sources, which serves as a reference for the overall direction of site operations and the company's budget.

Note: the SEO data collection and analysis process described above is aimed at most small and medium-sized companies and individual webmasters. Because my energy is limited, the introduction is relatively brief; I hope you will forgive me.

Postscript: a few points about "How to collect and analyze SEO data in the era of big data".

When I wrote the original article, there was a lot of content, and many parts could each have been a separate article, so merging them into one inevitably left the narrative insufficiently detailed. To keep the length from hurting readability, I published it on my personal blog in two parts, "How to standardize the collection and organization of SEO data" and "How to analyze website SEO data"; in addition, the complete integrated piece was published on Moonlight Blog with the title unchanged. It was originally intended as a standardized description of organizing and analyzing website SEO data, but perhaps because of my limited powers of expression it was misunderstood by many netizens. I hereby make the following clarifications:

1. The focus of the article is not "big data". To avoid misunderstanding, the article repeatedly emphasizes that it is meant as a reference for SEO data organization and analysis in small and medium-sized enterprises, and states at the beginning: "First of all, to data masters this article is a little redundant, mere child's play, so masters, please do not waste your time." Perhaps the title does smack of click-bait by invoking "big data", but for the majority of domestic small and medium-sized enterprises, big data and cloud computing are hard to realize within the company. Even so, with the arrival of the big data and cloud computing era, even small and medium-sized enterprises, especially Internet companies, will be affected. The author believes that the core of big data is not the dead data itself but the ability to analyze and predict from it, so the core of this article also lies in organizing and analyzing data, not in talking about big data that is unrealistic for small and medium-sized enterprises, let alone "big data analysis". If you are not a multinational group or large enterprise and cannot produce massive amounts of data, please do not talk about big data; it will only mislead, to say nothing of being superstitious about it.

2. Due to limited space, the article cannot go into detail.
I stated at the end of the article that, limited by space and personal energy, I could not elaborate on the work of collecting and analyzing SEO data, that some of the content is introduced only briefly, and that I did not intend to write a tutorial. Of course, all of this content comes from personal experience. It may be somewhat narrow, limited by my SEO level, but it is personal and original; as for accusations of copy-and-paste, or of merely explaining a few terms, I have nothing to say. I believe a good drum does not need a heavy hammer; there is no need to hand-write a step-by-step teaching article, since this was written for SEOers and marketing teams with a certain foundation.

3. Why collect the corresponding SEO data: the article has already explained this. Many netizens asked me after reading why I want to collect those data, or exactly which SEO data to collect. In fact, although constrained by length, I still roughly listed the SEO data that needs to be collected and explained why. The section on how to analyze the collected data covers not only how to analyze it but also, briefly, why to collect it, because knowing how to read these data is knowing why to collect them.

4. The Excel tables are only simple illustrations, not real case studies. To accompany the description of organizing and analyzing SEO data, I had to put together a few simple Excel tables on the spot; also for reasons of length, I gave up on detailed descriptions or cases that would have made the article even longer. Again, please forgive me for not providing cases; the Excel tables are only simple illustrations and have no reference value in themselves.

5. This article focuses on ideas rather than sharing operational examples. Many readers said it is empty theory with nothing substantive. I will not bother to answer complaints of this kind, which mostly come from laymen. To apply the old saying again: a good drum does not need a heavy hammer. This article only introduces an approach to collection and analysis, along with a simple process and some standardized instructions. Those who want a hands-on program will be disappointed, because there are no so-called "dry goods" here; that is not what this is. Most of my articles share ideas and strategies from my network marketing experience and rarely cover specific techniques or hands-on step-by-step operations. I firmly believe that giving someone a fish is not as good as teaching them to fish: the same operating methods and case techniques are not necessarily suitable for another site, but the way of looking at problems and handling things is a strategy worth sharing and spreading.