SEO Tutorial / FAQ
The essentials of SEO
1. Introduction
1.1. What is SEO?
1.2. Do I need SEO?
1.3. Should I hire someone or make it all myself?
2. Basic concepts
2.1. Search engines
2.2. Terminology
3. Ranking factors
3.1. On-Page ranking factors
3.1.1. Important stuff
3.1.2. Helpful stuff
3.1.3. Useless stuff
3.1.4. Stuff that hurts your rankings
3.1.5. On-page factors summary
3.2. Off-Page ranking factors
3.2.1. What is it?
3.2.2. PageRank
3.2.3. Important stuff
3.2.4. Helpful stuff
3.2.5. Useless stuff
3.2.6. Stuff that hurts your rankings
3.2.7. Off-page factors summary
4. SEO strategies
4.1. On-Page SEO strategies
4.2. Off-Page SEO strategies
4.3. Tracking the results
4.4. Help! I've lost my rankings!
5. SEO software and tools
6. SEO resources that are worth visiting
Write for humans, not for search engines! Remember: you offer your products to humans. It is a human who reads the texts on your website and decides whether or not to purchase from you. Technically speaking, search engines read your site too, but I have never heard of a search engine that buys anything.
So you should create content that is interesting and useful for your human visitors in the first place!
First of all, let me assure you that this is NOT the usual SEO tutorial made up mostly of techniques that worked well in 2003, written by guys who never tried to apply those techniques themselves in 2010. This tutorial contains some basic SEO info as well as some more advanced tips and tricks. Many of them are rather obvious, but that doesn't make them less important; in fact, it seems to be their very obviousness that makes some people think these tips don't work.



1. Introduction
One of the buzzwords of the last 10 years in internet marketing is SEO. Everyone talks about SEO, and everyone tries to apply it with more or less success. If you are experienced in this field you may skip the first chapters; otherwise, read along to learn the very basics.
1.1. What is SEO?
SEO stands for Search Engine Optimization. Search engines as a way to find information on the web appeared in the mid-90s. They crawled websites and indexed them in their own databases, marking each page as containing one keyword or another. Thus, when someone typed a query into the search box, the engine quickly searched its database and found which indexed pages corresponded to that query.
So, the more keywords of a query a website contained, the higher it was shown in the search results. We don't know who was the first guy to realize that he could make some changes to his pages to make them rank higher, but he was truly a diamond!
So, SEO is anything that helps your site rank better in search engines. There are a number of SEO methods: some of them are legitimate, while others are restricted and considered spam. Anyway, we'll cover this material thoroughly later in this SEO FAQ.
Back to Table of Contents
1.2. Do I need SEO?
Well, the answer is yes... or is it? Let's think a bit more. Does SEO help, say, an oil-extracting company sell its product? Does it help promote a small local grocery in your neighbourhood? Does it help Obama rule his bureaucrats? Well, I guess you've got the idea. SEO is effective mostly for Internet businesses. Do you have one? Then you need SEO. Otherwise, SEO is only one of the possible channels to spread the word about your product or service, and not necessarily the best one.
Back to Table of Contents
1.3. Should I hire someone or do it all myself?
One of the most frequent unspoken questions is: should I hire an SEO professional, or save a few bucks and do it myself? There is no universal answer for all situations, so here are some pros and cons:

Hired SEO
Pros: you don't have to waste your time; you don't have to learn SEO yourself; SEO pros can be quite effective.
Cons: you still need to supervise the hired SEO yourself; SEOs usually give no guarantees, and you must be very cautious when choosing one to hire; the hired guy may be an SEO professional, but not necessarily a professional in your field; finally, you have to pay him.

Do it yourself
Pros: if you want it done right, do it yourself; you are the one running the whole show, so you know best what is right and wrong about it; you really do save some bucks; you can constantly monitor the trends and adjust your SEO strategy on the fly.
Cons: you will need to spend some time reading SEO FAQs and tutorials like this one, posting newbie questions on forums and doing the other things newbies always do (it doesn't kill, but it takes time); you may get very little SEO benefit for all your effort and time spent. After all, you are not a guru, right?
Back to Table of Contents
2. Basic concepts
2.1. Search engines
Before we start talking about search engine optimization, we need to understand how search engines work. Basically, each search engine consists of 3 parts:

The Crawler (or spider). This part of a search engine is a simple robot that downloads the pages of a website and crawls them for links. Then it opens and downloads each of those links to crawl (spider) them too. The crawler visits websites periodically to find changes in their content and modify their rankings accordingly. Depending on the quality of a website and the frequency of its content updates, this may happen anywhere from once per month up to several times a day for high-popularity news sites.

The crawler does not rank websites itself. Instead, it simply passes all crawled websites to another search engine module called the indexer.

The Indexer. This module stores all the pages crawled by the spider in a large database called the index. Think of it as the index in a paper book: you find a word and see which pages mention it. The index is not static; it updates every time the crawler finds a new page or re-crawls one already present in the index. Since the volume of the index is very large, it often takes time to commit all the changes to the database. So one may say that a website has been crawled, but not yet indexed.

Once the website with all its content is added to the index, the third part of the search engine begins to work.

The Ranker (or search engine software). This part interacts with the user and asks for a search query. Then it sifts through millions of indexed pages and finds all that are relevant to the query. The results are sorted by relevance and finally shown to the user.

What is relevance and how would one determine if a page is more or less relevant to a query? Here comes the tricky part - the ranking factors ...
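The three-part pipeline above can be sketched in a few lines of code. This is only a toy model (the pages, words and scoring are invented for illustration): the crawler is replaced by an in-memory dict, the indexer builds an inverted index, and the ranker counts matching query words as a crude stand-in for real relevance scoring.

```python
# Toy model of the crawler -> indexer -> ranker pipeline.
# "Pages" are an in-memory dict; a real crawler would download URLs.
pages = {
    "/dogs.html": "dog training tips for new dog owners",
    "/cats.html": "cat care basics",
    "/pets.html": "dog and cat food compared",
}

# Indexer: build an inverted index mapping each word to the pages containing it.
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Ranker: for a query, count how many query words each page matches
# and sort by that score (a crude stand-in for real relevance ranking).
def search(query):
    scores = {}
    for word in query.split():
        for url in index.get(word, ()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=lambda u: (-scores[u], u))

print(search("dog food"))  # pages matching both words rank first
```

Pages that match more words of the query come first, which is exactly the "more keywords of a query" behaviour described in section 1.1.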
Back to Table of Contents
2.2. Terminology Here are the basic terms you need to know. All others will be explained along the way.

Anchor text
This is simply the text of a link. Let's suppose you have a link like this (the URL and wording are just an example):

<a href="http://example.com/dogs.html">dog training tips</a>

Here, "dog training tips" is the anchor text. Make sure the anchor text of a link matches the theme of the page it points to: if your page is about dogs, do not link to it with an anchor about, say, cheap car parts. Also make all links within your own website use appropriate anchor text.

Inbound link
... or backlink, is a link that points to your site. The more you have, the better. But there are many exceptions to this rule, so read the Off-Page optimization section to learn more.

Keyword
One or more words describing the theme of a website or page. Strictly speaking, we should distinguish keyWORDS from keyPHRASES, but in SEO practice they are all called keywords. For instance, the keywords for this page are: SEO FAQ, SEO tutorial, etc.

Short-tail and long-tail keywords
Easy one. Short-tail keywords are general, common words and phrases like "new york". Long-tail keywords are longer, more specific queries, like "cheap hotel near central park new york". The flip side of the coin is: since each long-tail query is highly targeted, once a visitor comes to your website from such a query and finds what he is looking for, it is very likely that this visitor will soon become a customer. This part is very important! Long-tail queries are not very popular, but their conversion rate is much, much greater than for short-tail ones.



SERPs
You may have heard this term without understanding what it is. SERP stands for Search Engine Results Page. The results shown in the first positions get many more visitors than the ones on pages 2-3 and lower. This is the purpose of SEO, actually: to make a website move higher in the SERPs.

Snippet
This is the short description shown by a search engine in the SERP listings. The snippet is often taken from the Meta Description tag, or it can be created by the search engine automatically, based on the content of the page.

Landing page
A landing page is the page that opens when a visitor comes to your site by clicking a result in the SERP. For example, if a search result points to the page .../en/google-monitor-query.htm and a visitor clicks it, that page is the landing page for that query.

Link juice
Link juice is a slang term for the value passed between two pages by a link between them. To be precise: the linked page (the acceptor) gets link juice from the linking page (the donor). The more link juice flows into a page, the higher it is ranked. Let's imagine a page that is worth $10: this is the value of that page. If the page has 2 links, each one then carries $5, and that is the amount of link juice passed to each linked page. If it has 5 links, each one only passes $2 of the initial link juice. Here is a simple picture to illustrate this concept:


Each link passes $5 of value


Each link passes only $2 of value


This means that the more links Page A has, the less value each linked page B gains from it. Obviously, real link juice is not measured in dollars.
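The dollar arithmetic above is trivial, but here it is made explicit as a tiny helper, just to underline that the split is an even division:

```python
# Link juice split from the example above: a page's value is divided
# evenly among its outgoing links.
def juice_per_link(page_value, outgoing_links):
    return page_value / outgoing_links

print(juice_per_link(10, 2))  # 5.0 -- two links, $5 each
print(juice_per_link(10, 5))  # 2.0 -- five links, $2 each
```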

Nofollow links
A nofollow link is a link that a search engine should not follow. To make a link nofollow, you add a rel attribute to it, like this:

<a href="http://example.com/" rel="nofollow">some link text</a>

Keyword stuffing
Imagine if the title of this page were something like: SEO guide, SEO FAQ, SEO tutorial, best seo faq, seo techniques, seo strategy guide, and so on. That would be keyword stuffing. Instead, the current title of this page (the one you're reading now) looks quite natural and adequately describes its contents. Do not use keyword stuffing, as a) it does not work; b) it is a bad practice that can hurt your rankings.

Robots.txt
robots.txt is a file that tells search engine spiders whether or not they are allowed to crawl the content of a site. It is a simple text file placed in the root folder of your website. Here are some examples:

This one blocks the entire site for GoogleBot:
User-agent: Googlebot
Disallow: /

This one blocks all files within a single folder, except myfile.html, for all crawlers:
User-agent: *
Disallow: /folder1/
Allow: /folder1/myfile.html
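If you want to double-check rules like these before deploying them, Python's standard library ships a robots.txt parser. Here is a small sketch; note that urllib.robotparser applies the first matching rule within a group, so the narrow Allow line is placed before the broader Disallow:

```python
# Checking robots.txt rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /folder1/myfile.html
Disallow: /folder1/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "http://site.com/page.html"))          # False: whole site blocked
print(rp.can_fetch("OtherBot", "http://site.com/folder1/page.html"))   # False: folder blocked
print(rp.can_fetch("OtherBot", "http://site.com/folder1/myfile.html")) # True: explicitly allowed
```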



Back to Table of Contents
3. Ranking factors
In general, there are only two groups of them: on-page and off-page ranking factors. It has been argued which group is more important, but we'll answer that question later in this FAQ. For now, you should understand that both are crucial and both need proper attention.
3.1. On-Page ranking factors
There are many on-page ranking factors, and even more have been talked about since the first days of SEO. Some of them are really important, while others are said to be crucial but are actually useless or even hurt your rankings. You know, search engines are evolving; they change their algorithms, and something that used to work in 2003 has now become a piece of useless garbage. So, here is the list of on-page ranking factors, sorted by their importance and SEO value.
3.1.1. Important stuff

Title
This seems to be one of the most important on-page factors, so pay close attention to the title tag: keep it short and natural, make sure it describes the actual content of the page, and put the most important keywords near its beginning.
Content
The next important factor is the content of a page, which seems a pretty naive claim at first glance, right? Wrong! Content is king, as SEOs like to repeat. Quality content not only describes your product or service, it also converts your visitors into customers, and customers into returning customers. Quality content increases your rankings, as search engines like quality content. Moreover, quality content even helps you get more inbound links to your website (see off-page ranking factors below)!

The basic tips for content boil down to the same rule as before: write naturally, for human readers first, and keep it interesting and useful.
Navigation and internal linking
Again, an important ranking factor. It seems obvious to create proper navigation so the search engine crawler can follow all the links on a website and then index all of its pages. However, this factor is still highly underestimated. Creating clear, plain-text navigation helps both search engines and human visitors.

Avoid using JavaScript or Flash links, since they are hard for search engines to read. Always provide an alternative way to open any page on your website with simple text links, and do have a sitemap of your website available from any other page in one click.

Also keep in mind that good internal linking spreads link juice across the pages of your website, which strongly helps your landing pages rank better in the SERPs for long-tail keywords. Use this wisely, though: link only to pages that really need to be linked to.

Let's suppose you have two pages: one generates $10 of income per visitor, while the other makes only $0.10. Which one would you link to first? Think of it that way: link to the most important and valuable pages of your website, using relevant anchor text for each link.
Back to Table of Contents


3.1.2. Helpful stuff
The factors and techniques below are not as crucial as the ones described above, but they still help a bit in gaining higher positions in the SERPs.

Headings
Once upon a time, search engines paid close attention to the heading tags (H1 through H6), but those days are gone. Heading tags are easily manipulated, so their value is not very high nowadays. Nevertheless, you still want to use headings to mark the beginning of a text, to split an article into parts, and to organize sections and sub-sections within your document. In other words, although headings provide only a small SEO value, they are still crucial for making your texts easily readable by human visitors.

Use the H1 tag for the main heading of the page, then H2 for article headings, and H3 to split the parts of an article with sub-headers. That is good practice and is enough to make your site readable by humans; it also adds some SEO points, which you should not neglect either.

Bold / Strong and Italic / Emphasized text
Both are nearly useless, but still have some (very little) SEO value. As with headings, you'd better use them for the benefit of your human visitors, emphasizing the key parts of the text. But do not put every 5th keyword in bold: it looks ugly, and it doesn't give any significant boost to your rankings anyway. Moreover, such a page would be very hard to read.

Keyword placement
The value of keywords in a text depends on their placement on the page. Keywords placed near the top of the document get a higher value than ones residing near the bottom. Important: by top and bottom I mean the source of the HTML document, not its visual appearance. That is why you want to put your navigation and supplemental text near the bottom of the source file, and all important, relevant content near the top.

This rule also works at a smaller scale: keywords placed at the beginning of the title tag count for more than ones placed 4th or 5th, and keywords at the beginning of an anchor text likewise get more value.

Keywords in filenames and domain names
An old trick: putting your target keywords into a filename, or having them in the domain name. It still works, but don't expect much of a boost from it.
Image Alt attribute
This one was very popular in 2003, but nowadays keyword stuffing of the Alt attribute does not give any SEO value to a page. A better use of the Alt attribute would be something like this: <img src="some-pic.gif" alt="a short description of the picture">
Meta Description
One of the most popular and persistent myths (alongside keyword density) concerns the Meta Description tag. They say it helps you rank better. They say it is crucial to fill it with an appropriate description of the page content. They say you must have it on each page of your website. None of this is true. Nowadays, the only way the Meta Description is used by search engines is that its content may be taken to create the snippet for the SERP. That is all! You don't get any other benefit from using the Meta Description on your page, nor do you incur any penalty for not using it.

There is an opposite opinion suggesting not to use the Meta Description at all, since a search engine creates a snippet based on the content of the page anyway, and you can't do this job better than a search engine, so why waste your time? Personally, I would not agree with this point, since according to Google's guidelines the Meta Description tag is still the preferred source of the snippet text. But it is up to you to decide whether you want it on your pages or not; as stated above, it has no additional SEO impact, neither positive nor negative.
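For reference, this is what the tag looks like in a page's head section (the description wording here is just an illustration):

```html
<head>
  <title>SEO Tutorial / FAQ</title>
  <!-- May be reused by a search engine as the SERP snippet: -->
  <meta name="description" content="A practical SEO tutorial covering on-page and off-page ranking factors, strategies and tools.">
</head>
```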
Back to Table of Contents
3.1.3. Useless stuff (no pain, but no gain either)

Meta Keywords
Long ago, the Meta Keywords tag was used by search engines to determine the theme of a page, but it was so easily abused that the major engines now simply ignore it.

Keyword density
Keyword density is the number of times a keyword appears on a page, divided by the total word count. Why is this value useless? Because search engines have evolved and no longer rely on keyword density, since it is very easily manipulated. There are thousands of factors that search engines consider when ranking a page, so why would they need such a simple (not to say primitive) method as counting the number of times a word appears in the text? You may hear that a keyword density of 6% is the best rate, or that you should keep it within 7% to 10%, or that search engines like a density of 3% to 7%, and other such nonsense. The truth is ...

Search engines like pages written in a natural language. Write for humans, not for search engines! A page can have any keyword density from 0% (no keyword on a page at all) to 100% (a page consisting of only one word) and still rank high.

Of course, you may still want to control the keyword density of your pages, but please remember that there is no magic value for this factor. Any value will work if your text is written with a human reader in mind. Why would one still check keyword density at all if it is not counted any more? Because it is a quick and dirty way to estimate the theme of a page. Just do not overestimate this thing: it is merely a number, nothing more, and it is useless for SEO.

Another interesting question: why is this myth still alive, and why do so many people still talk about keyword density as an important ranking factor? Perhaps because keyword density is easy to understand, and easy to modify if needed. You can see it right there with your naked eye and quickly judge whether your site is doing well or badly. Well, it only seems that way: keyword density is useless, remember?
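As a quick-and-dirty theme check of the kind just described, keyword density takes only a few lines to compute (the sample text is made up, and the punctuation handling is deliberately naive):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` equal to `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "SEO is about people. Write for people, and SEO follows."
print(keyword_density(sample, "seo"))  # 20.0 -- and the text still reads naturally
```

A number like 20% would horrify the density-myth believers, yet the sample sentence is perfectly natural, which is exactly the point made above.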

Dynamic URLs vs. static URLs
Believe it or not, there is no difference: both are of the same SEO value. The days when search engines had difficulties indexing dynamic URLs are gone for good.

www.site.com vs. site.com
No difference either. If you want your site to be accessible both ways, with one form redirecting to the other, add something like this to your .htaccess file (Apache with mod_rewrite assumed; replace domain.com with your own domain):

RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain.com
RewriteRule (.*) http://www.domain.com/$1 [R=301,L]


Underscore vs. hyphen in URLs
Once again, there is no difference from the SEO point of view. You can use an underscore, or a hyphen, or no separator at all: it neither helps nor hurts your positions in the SERPs.

Subfolders
Is it better to have a /red-small-cheap-widget.php file rather than /widgets/red/small/cheap/index.php? Does it hurt your rank to put content deep into subfolders? The answer is no: it won't hurt your rankings, and it actually doesn't matter at all how deep in the folder tree a file is located. What matters is how many clicks it takes to reach that file from the homepage.

If you can reach a file in one click, it is certainly more important and will have more weight than, say, some other file located 5 clicks away from the index page. The homepage usually has a lot of link juice to share, so the pages it links to directly are obviously more important than others (since they receive more link juice, that is).

W3C validation
The W3C is the World Wide Web Consortium: an international consortium where member organizations, a full-time staff, and the public work together to develop Web standards. Basically speaking, these are the guys who invented HTML, CSS, SOAP, XML and other web technologies.

Validation is the process of checking a page or website for compliance with W3C standards; the W3C provides a free online validator for this. Note that the validator catches not only trivial things like unclosed quotes, undefined tags or wrong attribute values; it also checks for encoding problems, compliance with the declared DOCTYPE, obsolete tags and attributes, and much more.

Why is validation needed? A 100% valid website should display correctly (and identically!) in all standards-compliant browsers. Unfortunately, in real life some browsers do not strictly follow the W3C standards, so cross-browser problems are not a rare thing on the web. That doesn't belittle the importance of the W3C standards, however.

From the SEO point of view, though, validation doesn't look so crucial. Run google.com through the validator and you'll see a bunch of warnings and errors on their website. This example shows pretty clearly that Google doesn't care much about W3C validation itself: at least not enough to give a strong ranking boost to valid websites or to penalize erroneous ones. The recommended strategy is: validate to make your site work and stay accessible in all common browsers, and don't bother doing it for SEO purposes alone. If you don't experience any cross-browser issues, it works fine as it is.


Back to Table of Contents
3.1.4. Stuff that hurts your rankings

Keyword stuffing
Google defines this term pretty clearly. Once again: write for humans. Repeating keywords across a page can trigger Google's spam filter, which will result in a huge loss of positions, if not a total ban of your website. Write naturally, and optimize a bit where needed: that's the best way to use keywords nowadays.

Hidden text / Invisible links
First, let's see what Google says about hidden text. Obviously, Google doesn't like it, and if your site uses such techniques it may be excluded from Google's index. You may ask: how would Google know whether I use hidden text? Couldn't I just block the CSS file that hides it with my robots.txt, so Google never learns that the page has hidden text? This might work in the short term, but in the long run the disguise will fail sooner or later. It has also been reported that GoogleBot does not always strictly follow robots.txt instructions, and that it can read and parse JS and CSS without any problems; once it does, the consequences for your website and its rankings will be disastrous.

Doorway pages
As bad as an SEO method could ever be. Doorway pages are special landing pages created for the sole purpose of obtaining good positions for particular keywords. A doorway page has no valuable content; its only purpose is to catch a visitor from the SERP and redirect him to some other, non-doorway page, which, by the way, is usually completely irrelevant to the visitor's initial query.

Splogs
Splogs (from Spam Blogs) are the modern version of the old evil doorways. The technique was as follows: one created thousands of blogs on some free blog service like blogspot.com, linked them to each other, and obtained backlinks via blog comment spam and other blackhat methods (see below). The splogs themselves did not contain any unique information; their content was always automatically generated articles stuffed with keywords. However, due to the large number of inbound links, such splogs ranked very well in the SERPs, dislodging many legitimate blogs. Later, Google implemented filters to protect itself from the flood of splogs, and now any splog gets banned pretty fast.

If you own a blog, do not make it spammy. Instead, focus your attention on writing good, interesting content. That works better, in fact.

Cloaking
Not as bad in some particular cases, but still a blackhat technique. The method is based on determining whether a visitor is a human or a search engine spider, and then deciding which content to show: humans get one variant of the website, while search engines get another, stuffed with keywords.

Duplicate content
Being a scarecrow for many webmasters, duplicate content is actually not as dangerous as it is said to be. There are two types of content that can be called duplicate. The first case is when a website has several different ways to access the same page, for instance:

http://site.com/
http://www.site.com/
http://site.com/index.html
http://www.site.com/index.html
etc.

All four refer to the same page but are technically treated as different pages with the same content. This type of duplicate content issue is easily resolved by Google itself and does not lead to any penalty.
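Collapsing such variants is mostly the search engine's job, but the normalization idea can be sketched like this (the rules below are a simplified guess at what an engine might do, not a description of any real algorithm):

```python
from urllib.parse import urlparse

def canonical(url):
    # Lower-case the host, strip a leading "www.", and fold the
    # default document (empty path or /index.html) into "/".
    p = urlparse(url)
    host = p.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = p.path if p.path not in ("", "/index.html") else "/"
    return f"http://{host}{path}"

variants = [
    "http://site.com/",
    "http://www.site.com/",
    "http://site.com/index.html",
    "http://www.site.com/index.html",
]
print({canonical(u) for u in variants})  # all four collapse to {'http://site.com/'}
```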

The other type is duplicate content across different domain names. The content of a website is considered duplicate if it doesn't add any value to the original. That is, if you simply copy-paste an article to your site, it is duplicate content. If you copy an article and add your own comments, or review it from your point of view, it is not. The key criterion here is added value: if a site adds value to the original information, it is not duplicate.

There are two other points worth mentioning here. First, if someone copies your text and posts it on another site, it is very unlikely that you will be penalized for it: Google tracks the age of each page and tends to consider the older one (your website, in this case) as the source of the original text. Second, you can still borrow materials from other websites without significant risk of a duplicate content penalty by simply rewriting the text in your own words. There are ways to produce unique random texts using Markov chains, synonymizers and other methods, but I would not recommend them, since the output looks too spammy.

