Showing posts with label google seo. Show all posts

Monday, October 19, 2009

7 Tricks to Get a Google of Links

SEO is a race. And in any race, learning from your competitors makes you a better runner. Even when you're running first, it's sometimes good to look back and check the runners-up. And if you're not the yellow-jersey guy, you absolutely should examine the leaders: their gear, their training, their strategy. In SEO, the most interesting thing about your competition is their links.


Whether you like it or not, SEO is still pretty much about links. A good link profile can make up for almost any lack of optimized content and other on-page flaws. Love it or hate it, the best thing you can do is embrace the fact and run with it.


So let's go through some tricks that will enable you to look deeper into your competition's link profile granting you access to the restricted areas: their locker room, dirty laundry and even the briefing hall where they plan their link building strategies.


Let's Talk Competitive Link Research


Finding out where your competitors' links come from is not all that hard. You just go to Yahoo! or Google and type in link:www.your-competitor.com to get a list of inbound links to the site.


Yahoo! is much better in that respect, as it tends to give more extensive and accurate data. The problem is that there's a limit of 1,000 links per website, which is often not enough, as the fattest link sources get left behind the limit fence. Here are some tips to break through to the other side.


Note: If you're lazy like me, skip to the end of the article, where I'll share a tool that does it all much quicker.


Trick 1: Search for Links to Particular Web Pages of a Competing Site


Alongside link:www.your-competitor.com, search for


link:www.your-competitor.com/products.html or
link:www.your-competitor.com/services.html


and so on.


Trick 2: Exclude Internal Links


You may examine the internal linking structure of your competition if you want to gain some insight into their navigation and marketing steps. But since we want to find more external links, let's exclude the internal ones.


You can do this by adding the -site:site.com operator to your search query. Type in:


link:http://www.your-competitor.com -site:your-competitor.com or
linkdomain:www.your-competitor.com -site:your-competitor.com


and you'll get a list of external backlinks only.


There's a dropdown option in Yahoo! site explorer that does the same.


Trick 3: Exclude Links Coming from Certain Domains


The -site: modifier lets you exclude links coming from specific sites. So, whenever you see a large chunk of links coming from the same domain, add the -site:thisdomain.com modifier to your query and the links from this site will be replaced with new ones.


You can add -site: multiple times in one query so that you have something like this:


link:http://www.cnn.com -site:cnn.com -site:en.wikipedia.org


Trick 4: Check Links Coming from Certain TLDs


This is a little-known trick. The site: modifier actually lets you get a list of links coming from domains with certain TLDs: .com, .org, .edu, .co.uk and so on. Just type in


link:http://www.your-competitor.com site:.gov or
linkdomain:www.your-competitor.com site:.gov


and you'll get a list of .gov sites linking to your rival.


Note: Do this in Yahoo! regular search, not Site Explorer.


Trick 5: Exclude Links Coming from Certain TLDs


This is an even lesser-known trick. You can exclude certain TLDs from the results with the -site:.tld modifier. Usually the biggest chunk of links comes from .com's, so add a -site:.com modifier and you'll get lots of new link data.


Trick 6: Use Different Combinations of the First 5 Tricks


Try link:http://www.your-competitor.com/page.html -site:your-competitor.com -site:.com
Or link:http://www.your-competitor.com site:.org -site:wikipedia.org
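If you run these combinations often, it can help to build the query strings programmatically rather than retyping them. Here's a minimal sketch; the function name and parameters are my own invention, and the operator syntax is assumed to match the 2009-era Yahoo! search behavior described above:

```python
def link_query(target, page=None, include_tld=None, exclude_sites=(), exclude_tlds=()):
    """Build a link: query string combining the tricks above:
    a specific page (Trick 1), excluded domains (Tricks 2-3)
    and included/excluded TLDs (Tricks 4-5)."""
    path = f"/{page}" if page else ""
    parts = [f"link:http://www.{target}{path}"]
    if include_tld:
        parts.append(f"site:{include_tld}")
    parts += [f"-site:{s}" for s in exclude_sites]
    parts += [f"-site:{t}" for t in exclude_tlds]
    return " ".join(parts)

# Reproduces the two example queries above:
q1 = link_query("your-competitor.com", page="page.html",
                exclude_sites=["your-competitor.com"], exclude_tlds=[".com"])
q2 = link_query("your-competitor.com", include_tld=".org",
                exclude_sites=["wikipedia.org"])
```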


Give it a thought and I'm sure you'll come up with lots of ideas. Feel free to share your findings in the comments.


Trick 7: Use the Above 6 Tricks in Different Search Engines


Don't limit your searches to Yahoo! and Google. Go to AltaVista and Alexa (Bing doesn't give you link data, so forget about it), and then there are Exalead, Excite and tons of regional search engines. Search them, get rid of the duplicates and you'll have a goooooooooooooooogol of competitor links to study.


Note: Some search engines have a different set of operators so you'll need to type domain: instead of link:.
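Once you've pulled link lists from several engines, the merge-and-deduplicate step can be sketched in a few lines. The helper names and sample URLs below are hypothetical; in practice you'd paste in whatever each engine returned:

```python
from urllib.parse import urlparse

def merge_backlinks(*link_lists):
    """Merge several lists of backlink URLs, dropping duplicates
    (treating trailing-slash variants as one) while preserving
    the order in which links were first seen."""
    seen = set()
    merged = []
    for links in link_lists:
        for url in links:
            key = url.rstrip("/").lower()
            if key not in seen:
                seen.add(key)
                merged.append(url)
    return merged

def linking_domains(links):
    """Count links per domain - a quick way to spot domains worth
    excluding with -site: in the next round of queries."""
    counts = {}
    for url in links:
        domain = urlparse(url).netloc
        counts[domain] = counts.get(domain, 0) + 1
    return counts

yahoo_links = ["http://example.org/a", "http://example.org/b",
               "http://blog.example.net/post"]
altavista_links = ["http://example.org/a/", "http://old.example.com/page"]

all_links = merge_backlinks(yahoo_links, altavista_links)
```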


Getting It All Done Fast


This sure seems like a lot of work and it is. Moreover, getting the links list is only the beginning and the easy part of competitive link research. Once you get the list you need to analyze each link, weed out poor quality sites and only leave the ones you can get a link from. Now THAT's a lot of work.


I'm too lazy to do all this by hand; besides, I value my time too much to waste it on this kind of work. That's why I use SEO SpyGlass, an advanced link analysis tool that employs all the tricks described in this article (plus some more advanced ones I don't even know) to get up to 25,000 links per domain, which is much, much more than any other tool can get.


SEO SpyGlass also finds all the data I need to analyze the links:


    • Google PR of the domain and linking page
    • The URL and title of the linking page
    • The anchor text and description
    • Whether the link is still on the page (sometimes the link gets removed but search engines will think it's there till they reindex the page)
    • Whether the link is nofollow or dofollow
    • How many other links are on the page
    • How much link value the link passes
    • And some other data like TLDs, domain age, country, etc.


If you want to do competitive link research seriously, I'd strongly recommend trying SEO SpyGlass out. And of course you can always use my tricks whenever you want to run a quick background check on that new guy on your block.


Source: www.Site-Reference.com



Thursday, October 15, 2009

Google SideWiki Encourages Public Graffiti on Any Site




Google has launched a controversial new tool that allows the public to comment on any web site in a side bar displayed in their browser.

Called Google Sidewiki, the tool is integrated into the latest version of Google Toolbar and works with both Firefox and Internet Explorer but, ironically, not yet Google Chrome. To use Sidewiki, download the latest version of the Google Toolbar and set it to enhanced.

When activated, Sidewiki slides across from the left and becomes a browser sidebar, where you can write entries in a vertical column and read the entries of others. To activate Sidewiki, you simply click on the Sidewiki button in your Toolbar menu or the little talk bubble on the left hand side of your screen.

See: http://www.sitepronews.com/images2/sidewiki.jpg

If you've got a Google profile, your image will appear next to your Sidewiki entry. You can either highlight a certain part of a web page, click the Sidewiki button and comment about it, or you can make a general comment about the entire web page. If you've got Sidewiki installed, you can see comments made on the same web site by other members of the public and you can forward your Sidewiki comments to colleagues, friends and family via direct link, email, Twitter or Facebook.


It appears that people can read the Sidewiki comments sent via link whether they have Sidewiki installed or not. When you're logged into Sidewiki, you'll always see your comments at the top and any others below.

Not only does your Sidewiki entry appear on the original page, but if you have highlighted text, your entry also appears on any webpages that contain the same snippet of text that your comment is about. From the official blog post:

"
Under the hood, we have even more technology that will take your entry about the current page and show it next to webpages that contain the same snippet of text. For example, an entry on a speech by President Obama will appear on all webpages that include the same quote. We also bring in relevant posts from blogs and other sources that talk about the current page so that you can discover their insights more easily, right next to the page they refer to."

Rather than viewing them in the order in which they were written, Sidewiki entries are ranked via an algorithm determined by Google:

"
So instead of displaying the most recent entries first, we rank Sidewiki entries using an algorithm that promotes the most useful, high-quality entries. It takes into account feedback from you and other users, previous entries made by the same author and many other signals we developed."

The technology used to determine ranking involves large-scale graph computing, but other factors are at play, as revealed by Danny Sullivan in his post about Sidewiki. These include use of sophisticated language, complex sentences and ideas, user reputation and user history as revealed by your Google profile and comment contributions. Your comments and others can be thumbed up or down using the "useful - yes or no?" tool, or reported as abuse, further contributing to your user reputation and "Profile Rank" as Danny calls it.

Google have also launched an API that allows developers to work freely with the content created in Sidewiki. Where no comments have been made on a web page, Google may show blog results relating to that page.

The potential applications of Sidewiki are interesting and frightening at the same time. For example, I can see how it could be a useful bookmarking tool, allowing you to make notes about a web site you've found which you could refer to later. You can even embed YouTube videos in Sidewiki (take a look at the Google home page to see this in action).

It also has fantastic potential as an online collaboration tool, letting you annotate the pages on a site in conjunction with team members in a similar way to tracking changes in a MS Word document and sharing document versions via Google Docs.





BUT (and it's a big but), I can see Sidewiki being open to abuse in a similar way to Searchwiki, Google's comment tool for search engine result pages. Searchwiki has been widely panned in the search industry because its Notes feature has been exploited by spammers, overactive PR companies and people with a chip on their shoulder about certain web brands. Unfortunately, I see Sidewiki heading in the same direction. And fast.

Any user-controlled element of a search engine is open to some level of abuse. But I don't see a huge amount of comment filtering going on yet and have already seen evidence of spamming (view the Microsoft home page with Sidewiki installed and you'll see anti-MS entries like this one).

Yes, Google have a usefulness rating system in place and a Report Abuse link, and are flagging some comments with the disclaimer "These entries may be less useful", but I doubt their filters will be able to keep up as Sidewiki takes off. There's also going to be the troll factor, which will undoubtedly lead to the system becoming worthless if it's not carefully controlled. I've viewed Sidewiki entries on some major sites this past week and it's already starting to feel like Toilet Wall Graffiti 2.0.

Sidewiki has program policies, but spammers don't care about those and trolls don't read them. Besides, one man's graffiti is another man's gospel.

Google's catch phrase for Sidewiki is: "Contribute helpful information to any web page". To that, I say: Define helpful.

About The Author

Article by Kalena Jordan, one of the first search engine optimization experts in Australia, who is well known and respected in the industry, particularly in the U.S.



Top Five SEO Tips to Increase Website Traffic

1. Extensive Keyword Research 


SEO starts with solid keyword research and sorting. Many SEO professionals are impatient and always in a hurry to implement their ideas.
Well, that keeps them endlessly revising their SEO efforts, with an eventual loss of time and traffic. Keyword research for effective SEO results needs a lot of patience and intelligent calculations. Above all, you should be using the best possible tools to perform keyword research for strong search engine optimization.
For example: Google Keyword Tool, Google Insight and Google Suggestions. You will get the exact number of monthly searches in the Google Keyword Tool. Here is what you should do then:
a) Download the keywords into an Excel sheet. 
b) Find your kind of keywords (those that match your website content and products). 
c) Sort them by daily searches. 
d) Collect the best ones for the home page and do similar calculations for the other important internal pages. 
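The sorting steps above can be sketched in a few lines of Python. The CSV columns and the volume threshold here are made-up placeholders for whatever your keyword tool actually exports:

```python
import csv
import io

# Hypothetical keyword export - stands in for the sheet you would
# download from a keyword tool.
csv_data = """keyword,monthly_searches
seo services,12000
link building,8000
keyword research,5400
cheap seo,900
"""

def sort_keywords(csv_text, min_monthly=1000):
    """Parse a keyword CSV, drop low-volume terms and sort the rest
    by approximate daily searches, highest first."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["monthly_searches"] = int(row["monthly_searches"])
        row["daily_searches"] = row["monthly_searches"] // 30
    kept = [r for r in rows if r["monthly_searches"] >= min_monthly]
    return sorted(kept, key=lambda r: r["daily_searches"], reverse=True)

ranked = sort_keywords(csv_data)
```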


2. Original and Optimized Content 


SEO-based content is the most important element on a live web page. Both search engine crawlers and human beings love to read original, effective content filled with useful information. However, there has to be efficient use of SEO content writing techniques as well. Smart, optimized content contains the right balance of keywords used throughout; usually it should be 5% to 6%. 
a) Use two to three keywords in the whole content. 
b) Start the content with the most important keyword and end it with the same keyword. 
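A rough way to check that 5-6% figure is a simple word-count script. This is my own approximation, not an official density formula, and it counts a multi-word keyword's words toward the total:

```python
def keyword_density(text, keyword):
    """Return the keyword's share of total words as a percentage,
    matching the keyword case-insensitively as a word sequence."""
    words = text.lower().split()
    kw_words = keyword.lower().split()
    n = len(kw_words)
    if not words or n == 0:
        return 0.0
    # Count occurrences of the keyword as a contiguous word run.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * (hits * n) / len(words)
```

For example, in a six-word sentence where a one-word keyword appears twice, the density comes out to about 33%, which is far above the 5-6% the article recommends for real copy.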


3. Web Page Optimization 


Search engine crawlers read only the HTML of the web page, so it must be properly optimized to guide them to the right areas and content of the website. Put the keywords in the Title and Description, and add more keywords in the Keywords meta tag. Do not forget to use H1 to H6 headings containing the keywords. Alt text must be placed on images wherever possible. 
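A quick audit of those on-page elements can be scripted with Python's standard html.parser. The class name and the sample page below are my own illustration:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect the on-page elements a crawler reads: the title,
    named meta tags, heading tags, and images missing alt text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}
        self.headings = []
        self.images_without_alt = 0
        self._in_title = False
        self._in_heading = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._in_heading = tag
        elif tag == "img" and not attrs.get("alt"):
            self.images_without_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == self._in_heading:
            self._in_heading = None

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_heading:
            self.headings.append((self._in_heading, data.strip()))

page = """<html><head><title>SEO Services</title>
<meta name="description" content="Affordable SEO services.">
</head><body><h1>SEO Services</h1><img src="logo.png"></body></html>"""

audit = OnPageAudit()
audit.feed(page)
```

Running this against the sample page flags the image with no alt text, exactly the kind of gap the tip above says to fix.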


4. Article Syndication 


Write articles and press releases about your website, its products and services. Submit the articles to as many article submission sites as possible and the press releases to major press release websites. Do not forget to leave anchor texts so that readers can come to your website as well. 
This practice is easy and very effective. It helps the website get crawled by the search engines within two days, which reinforces the natural ranking of the website. If you leave anchor texts for the internal pages as well, you can expect to get your website's internal pages cached (known to search engines). This boosts the website's presence even more. 


5. Social Bookmarking 


It is very important to be actively involved in social bookmarking activities. Join all the major social bookmarking websites, like Digg, Delicious, etc. Bookmark your own articles and press releases there, make friends and share your bookmarks. 
Social bookmarking websites are very well optimized and are cached by search engines every day. If your website's content, links, etc. are bookmarked there, you can expect your website to get cached by Google within twenty-four hours as well. 


Try the above-mentioned SEO ideas in your daily SEO work and experience the change in the results yourself.



How to Play the Game of Paid URL Inclusion


There are many ways to promote your website and one of the most efficient ways is to use search engines. Search engines are the first stop for most people trying to find information, services, and products online. Because of this, it is essential that your website appears quickly in search results.

The Internet contains numerous search engines, some of which provide what is known as "paid inclusion." This means that you pay the specific search engine an annual fee for your web page to be included in their index.

Of course, every search engine already has an automated program commonly called a "spider" that indexes all the web pages it locates online, and it does this for free. So whether you pay or not, your web page will eventually be indexed by all Internet search engines, as long as the spider can follow a link to your page. The major issue is, then, how quickly your page is indexed.

A search engine that offers a paid URL inclusion uses an extra spider that is programmed to index the particular pages that have been paid for. The difference between the spider that indexes pages for free and the spider that indexes only pages for a fee is speed. If you have paid for inclusion, the additional search engine spider will index your page immediately.

The debate over paid URL inclusion centers around the annual fee. Since the regular spider of these search engines would eventually get around to indexing your web page anyway, why is a renewal fee necessary? The fee is necessary to keep your pages in the search engine's index. If you go the route of paid inclusion, you should be aware that at the end of the pay period, on some search engines, your page will be removed from their index for a certain amount of time.

It's easy to get confused about whether you would benefit from paid inclusion since the spider of any search engine will eventually index your page without the additional cost. There are both advantages and disadvantages to paid URL inclusion, and it is only by weighing your pros and cons that you will be able to decide whether to spring for the cost or not.

The advantages are obvious: rapid inclusion and rapid re-indexing. Paid inclusion means that your pages will be indexed quickly and added to search results in a very short time after you have paid the fee. The time difference between when the regular spider will index your pages and when the paid spider will is a matter of months. The spider for paid inclusion usually indexes your pages in a day or two. Be aware that if you have no incoming links to your pages, the regular spider won't locate them at all.

Additionally, paid inclusion spiders will go back to your pages often, sometimes even daily. The advantage of this is that you can update your pages constantly to improve the ranking in which they appear in search engines, and the paid URL inclusion spider will show that result in a matter of days.

First and foremost, the disadvantage is the cost. For a ten-page website, the costs of paid URL inclusion range from $170 for Fast/Lycos to $600 for AltaVista, and you have to pay each engine their annual fee. How relevant the cost factor is will depend on your company.

Another, and perhaps more important, disadvantage is the limited reach of paid URL inclusion. The largest search engines, Google, Yahoo, and AOL, do not provide paid URL inclusion. That means the search engines you pay an inclusion fee to will account for only a small fraction of the traffic to your site on a daily basis.

Google usually updates its index every month, and there is no way you can speed up this process. You will have to wait for the Google spider to index your new pages no matter how many other search engines you have paid to update their index daily. Be aware that it is only after Google updates their index that your pages will show up in Google, Yahoo, or AOL results.

One way to figure out whether paid URL inclusion is a good deal for your company is to consider some common factors. First, find out if search engines have already indexed your pages. To do this, you may have to enter a number of different keywords, but the quickest way to find out is to enter your URL address in quotes. If your pages appear when you enter the URL address but do not appear when you enter keywords, using paid inclusion will not be beneficial. This is because your pages have already been indexed and ranked by the regular spider. If this is the case, your money would be better spent by updating your pages to improve your ranking in search results. Once you accomplish this, you can then consider using paid inclusion if you want to speed up the time it will take for the regular spider to revisit your pages.

The most important factor in deciding whether to use paid URL inclusion is to decide if it's a good investment. To figure this out, you have to look at the overall picture: what kind of product or service are you selling and how much traffic are you dependent on to see a profit?

If your company sells an inexpensive product that requires a large volume of traffic to your site, paid inclusion may not be the best investment for you; the biggest search engines do not provide it, and they are the engines that will bring you the majority of hits. On the other hand, if you have a business that offers an expensive service or product and requires a certain quality of traffic to your site, a paid URL inclusion is most likely an excellent investment.

Another factor is whether or not your pages are updated frequently. If the content changes on a daily or weekly basis, paid inclusion will ensure that your new pages are indexed often and quickly. The new content is indexed by the paid spider and then appears when relevant new keywords are entered in the search engines. Using paid inclusion in this case will ensure that your pages are being indexed in a timely manner.

You should also base your decision on whether or not your pages are dynamically generated. These types of pages are often difficult for regular spiders to locate and index. Paying to include the most important pages of a dynamically generated website will ensure that the paid spider indexes them.

Sometimes a regular spider will drop pages from its search engine, although these pages usually reappear in a few months. There are a number of reasons why this can happen, but by using paid URL inclusion, you will avoid the possibility. Paid URL inclusion guarantees that your pages are indexed, and if they are inadvertently dropped, the search engine will be on the lookout to locate them immediately.

As you can see, there are numerous factors to consider when it comes to paid URL inclusion. It can be a valuable investment depending on your situation. Evaluate your business needs and your website to determine if paid URL inclusion is a wise investment for your business goals.

About The Author
Nelson Tan is the webmaster behind Internet Mastery Center. Download $347 worth of Free Internet Marketing gifts at www.internetmasterycenter.com.

Thursday, October 8, 2009

Google practices dividing to conquer


SAN FRANCISCO--Google's 8 billion-plus Web document index may not multiply, but its search engine will learn to better divide the data.

That was part of the message from Peter Norvig, Google's director of search quality, who on Tuesday gave a keynote speech here at the Semantic Technology Conference. Norvig, a former NASA employee and an author of books on artificial intelligence, highlighted several research projects the company is developing to help classify data and improve the relevance of search results.

Those projects focus on adding new clustering capabilities for search results, providing suggestions for related searches, personalizing listings, and returning factual answers to specific questions, Norvig said.
"We want to have a broader bandwidth for that kind of communication," Norvig said. "It's a question of what's the right language."

Despite heavy competition in recent years to own the largest document index, Norvig also said he couldn't foresee Google's database adding many more Web documents without cataloging bogus or useless pages. Still, the company has numerous programs to add otherwise inaccessible data, like that from books and TV shows, to its Web search engine.
Norvig highlighted a research paper written by a Google employee last year regarding a classification engine the company is testing. The technology can parse a proper noun or compound nouns into several categories in order to deliver clustered results, for example. For a query on "ATM," or asynchronous transfer mode, the engine would be able to use the terms "such as" on Web pages indexed with the term to discover that it can be linked to the expression "high-speed networks." As a result, a search for high-speed networks might pull up a cluster on ATM.

Norvig said the same technology could be used to mine factual answers from the Web for queries like "President Lincoln's birth date." The technique could offer an edge over Microsoft's recent addition of encyclopedic answers to its database, thanks to its Encarta software, Norvig said. That's because MSN's engine could miss the chance to deliver the desired factual answer if the searcher's query is inexact. In contrast, Google draws on the semantic Web and various language sets from pages to find a match.
 

Norvig also demonstrated Keyhole, Google's satellite mapping service. He said that over time, the company will more tightly integrate its maps with local information on businesses and places. "It's important to deliver information about the real world as people carry devices around," he said.
