
Best Tips For Google Image Search - How to Index Images in Google Image Search

Posted by Extreme | comments (4)




How to get Images Indexed by Google Image Search
Google Image Search can be used in many ways.

1. If you want to know whether a person is a man or a woman and the name doesn't help, do an image search for the name.

2. If you don't know the meaning of a word, the pictures may help you.

3. A better search for Flickr. Google uses information from other sites that link to Flickr photos, so you may find Google Image Search better than Flickr's own search.

4. Find what's interesting about a site, by looking at the pictures included. For example: wired.com.

5. Find a new wallpaper for your desktop by restricting your search to large images. You can automate this using an application.

6. Find random personal pictures, using standard file names from digital cameras (for example, "DSC00001" or "IMG_0001").

7. Type the name of a painter and the results read like a quick art class.

8. Install a Greasemonkey script so you can view the original version of the image directly by clicking on the thumbnail.

9. Find the color of a word (see the sketch after this list). "Word Color is a Windows program that uses Google Image Search to determine the color of a word or string of words. It goes out there, retrieves the top 9 images and loops through all pixels, calculating the average hue, which is later converted to a color."

10. If you want to grab search results, GoogleGrab is a tool that downloads images from Google Image Search. It even supports batch search.
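
For the curious, here is a minimal sketch of the average-hue idea behind Word Color, in Python with the Pillow imaging library. It assumes the top images have already been saved locally (fetching them from Google Image Search is left out), and the file names are placeholders.

import colorsys
from PIL import Image

def average_hue(paths):
    total, count = 0.0, 0
    for path in paths:
        img = Image.open(path).convert("RGB").resize((50, 50))  # shrink for speed
        for r, g, b in img.getdata():
            hue, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            total += hue
            count += 1
    return total / count  # average hue in the range [0, 1)

# hypothetical file names: the top 9 results saved by hand
print(average_hue(["word%d.jpg" % i for i in range(1, 10)]))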

Best Tips to Protect Your Personal Email

Posted by Extreme | comments (12)



E-mail and instant messaging (IM) are terrific forms of communication for keeping in touch with family, friends, and business associates. However, using them unwisely may make you and your computer susceptible to spam, phishing scams, viruses, and other online threats. Here are some tips to avoid these problems:

1. Obtain comprehensive security software. Be sure that the security software you select, such as McAfee Internet Security, protects you and your PC from viruses, worms, Trojans, unwanted e-mail (spam), phishing scams, and other malicious software. It should also include a firewall, which can monitor your Internet connection and stop unwanted traffic to and from your PC. Be sure to keep your security software up to date; ideally, pick a suite that handles automatic updates and upgrades.

2. Share your e-mail address with only trusted sources. Only your family, friends, and trusted business contacts should have your personal e-mail address. Do not post your e-mail address on Web sites, forums, or in chat rooms. If you post your e-mail address, you are vulnerable to receiving spam or having your e-mail passed on to others. If you would like to subscribe to a newsletter or Web site and receive confirmation e-mail for online transactions, consider using a generic e-mail address that is not linked to any of your personal information. An example of a generic e-mail address is giraffe@samplee-mail.com.

3. Be careful when opening attachments and downloading files from friends and family or accepting unknown e-mails. You can obtain a virus, worm, or Trojan simply by opening e-mail and attachments, and by accepting files from your friends, family, or others. If you choose to download files, make sure your security software is enabled and pay close attention to any warnings provided.

4. Be smart when using IM programs. If you use an IM program to communicate with friends and family, be careful when sending any personal information. Protect yourself by using a nickname for your IM screen name. Never accept strangers into your IM groups. Be smart about how you use your personal IM at work because your employer may monitor and view your personal messages.

5. Watch out for phishing scams. Phishing scams use fraudulent e-mails and fake Web sites, masquerading as legitimate businesses, to lure unsuspecting users into revealing private account or login information. To be safe, if you receive an e-mail from a business that includes a link to a Web site, make certain that the Web site you visit is legitimate. Instead of clicking through to the Web site from within the e-mail, open a separate Web browser and visit the business’ Web site directly to perform the necessary actions. You can also verify that an e-mail is in fact from a legitimate business by calling the business or agency directly.

6. Use e-mail wisely. E-mail is a great way to keep in touch with friends and family, and as a tool to conduct business. Even if you have good security software on your PC, your friends and family might not have the same protection. Be careful about what information you submit via e-mail. Never send your credit-card information, Social Security number, or other private information via e-mail.


7. Do not reply to spam e-mail. If you don’t recognize the sender, don’t respond. Even replying to spam mail to unsubscribe could set you up for more spam.

8. Create a complex e-mail address. A complex e-mail address makes it more difficult for hackers to auto-generate your e-mail, send spam e-mail, or target your e-mail for other types of attacks. Make sure you come up with an e-mail address that you can easily remember. Try to use letters, numbers, and other characters in a unique combination. Substitute numbers for letters when you can. A sample complex e-mail is: Tracy3Socc3r2@samplee-mail.com.

9. Create smart and strong passwords. Make it difficult for hackers to crack your password. You can create a smart password by incorporating capital letters, numbers, special characters and using more than six characters. An example of a strong password is: Go1dM!n3.
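
As a rough sketch of how you might generate such a password programmatically (an assumption on my part, not something this article prescribes), Python's standard secrets module works well; the length and character pool here are arbitrary example choices.

import secrets
import string

def make_password(length=10):
    # pool mixes capital letters, lowercase letters, digits and special characters;
    # for simplicity this does not force every character class to appear
    pool = string.ascii_letters + string.digits + "!@#$%"
    return "".join(secrets.choice(pool) for _ in range(length))

print(make_password())  # e.g. pQ3!x9Kd#b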


10. Never enter your personal information into a pop-up screen. Sometimes a phisher will direct you to a real organization’s Web site, but then an unauthorized pop-up screen created by the scammer will appear, with blanks in which to provide your personal information. If you fill it in, your information will go to the phisher. Install pop-up blocking software to help prevent this type of phishing attack.

WordPress Ping List : Best List of WordPress Ping URLs

Posted by Extreme | comments (10)



Welcome to the WordPress ping list. We've searched the Internet for the best and most up-to-date ping services for use on your blogs. Finding them can be tough, and finding an updated list can be even more work. We've made it easy with our list. Just copy and paste it into your WordPress site and you're good to go.

This is a recommended ping list for a WordPress blog. The information is gathered from personal experience and various sources on the web.

Every time you post, these services will be notified of your blog post, increasing your online exposure.

To use this list, copy and paste it into the Update Services box under the Settings->Writing tab in the admin panel.

http://api.moreover.com/RPC2
http://bblog.com/ping.php
http://blogsearch.google.com/ping/RPC2
http://ping.weblogalot.com/rpc.php
http://ping.feedburner.com
http://ping.syndic8.com/xmlrpc.php
http://ping.bloggers.jp/rpc/
http://rpc.pingomatic.com/
http://rpc.weblogs.com/RPC2
http://rpc.technorati.com/rpc/ping
http://topicexchange.com/RPC2
http://www.blogpeople.net/servlet/weblogUpdates
http://xping.pubsub.com/ping
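
Behind the scenes, WordPress notifies these services with an XML-RPC call named weblogUpdates.ping. Here is a minimal Python sketch of the same call, using two endpoints from the list above; the blog name and URL are placeholders, and some endpoints may no longer respond.

import xmlrpc.client

PING_URLS = [
    "http://rpc.pingomatic.com/",
    "http://blogsearch.google.com/ping/RPC2",
]

def ping_all(blog_name, blog_url):
    for endpoint in PING_URLS:
        try:
            server = xmlrpc.client.ServerProxy(endpoint)
            # weblogUpdates.ping is the conventional method name for these services
            result = server.weblogUpdates.ping(blog_name, blog_url)
            print(endpoint, "->", result)
        except Exception as exc:  # dead or slow endpoints simply error out
            print(endpoint, "failed:", exc)

ping_all("My Blog", "http://www.example.com/")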

More Links = Less Page Rank?

Posted by Extreme | comments (2)



PageRank Explained: Google's PageRank and How to Make the Most of It

An interesting observation has kept recurring in recent months.

Site A has a PR7 backlink and 3 backlinks total. Site B has a PR7 backlink and 600 backlinks total.

Site A got PR6, while Site B got only PR3.

This almost implies that the PageRank flowing into a site is 'averaged' over all incoming links. Meaning: the larger the number of crap links (low-PR, unranked, or PR0 links) you have, the smaller your resulting PageRank, as these links devalue any high-PR links you might have.



How is PageRank calculated?
To calculate the PageRank for a page, all of its inbound links are taken into account. These are links from within the site and links from outside the site.

PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

That's the equation that calculates a page's PageRank. It's the original one that was published when PageRank was being developed, and it is probable that Google uses a variation of it but they aren't telling us what it is. It doesn't matter though, as this equation is good enough.

In the equation 't1 - tn' are pages linking to page A, 'C' is the number of outbound links that a page has and 'd' is a damping factor, usually set to 0.85.

We can think of it in a simpler way:-

a page's PageRank = 0.15 + 0.85 * (a "share" of the PageRank of every page that links to it)

"share" = the linking page's PageRank divided by the number of outbound links on the page.

A page "votes" an amount of PageRank onto each page that it links to. The amount of PageRank that it has to vote with is a little less than its own PageRank value (its own value * 0.85). This value is shared equally between all the pages that it links to.

From this, we could conclude that a link from a page with PR4 and 5 outbound links is worth more than a link from a page with PR8 and 100 outbound links. The PageRank of a page that links to yours is important but the number of links on that page is also important. The more links there are on a page, the less PageRank value your page will receive from it.
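
To put rough numbers on that: with d = 0.85, the PR4 page with 5 outbound links votes 0.85 * 4 / 5 = 0.68 to each linked page, while the PR8 page with 100 outbound links votes only 0.85 * 8 / 100 = 0.068 per link, a tenth as much, assuming for the moment that the PageRank values are on a linear scale.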

If the PageRank value differences between PR1, PR2,.....PR10 were equal then that conclusion would hold up, but many people believe that the values between PR1 and PR10 (the maximum) are set on a logarithmic scale, and there is very good reason for believing it. Nobody outside Google knows for sure one way or the other, but the chances are high that the scale is logarithmic, or similar. If so, it means that it takes a lot more additional PageRank for a page to move up to the next PageRank level than it did to move up from the previous PageRank level. The result is that it reverses the previous conclusion, so that a link from a PR8 page that has lots of outbound links is worth more than a link from a PR4 page that has only a few outbound links.

Whichever scale Google uses, we can be sure of one thing. A link from another site increases our site's PageRank. Just remember to avoid links from link farms.

Note that when a page votes its PageRank value to other pages, its own PageRank is not reduced by the value that it is voting. The page doing the voting doesn't give away its PageRank and end up with nothing. It isn't a transfer of PageRank. It is simply a vote according to the page's PageRank value. It's like a shareholders meeting where each shareholder votes according to the number of shares held, but the shares themselves aren't given away. Even so, pages do lose some PageRank indirectly, as we'll see later.

Ok so far? Good. Now we'll look at how the calculations are actually done.

For a page's calculation, its existing PageRank (if it has any) is abandoned completely and a fresh calculation is done where the page relies solely on the PageRank "voted" for it by its current inbound links, which may have changed since the last time the page's PageRank was calculated.

The equation shows clearly how a page's PageRank is arrived at. But what isn't immediately obvious is that it can't work if the calculation is done just once. Suppose we have 2 pages, A and B, which link to each other, and neither have any other links of any kind. This is what happens:-

Step 1: Calculate page A's PageRank from the value of its inbound links

Page A now has a new PageRank value. The calculation used the value of the inbound link from page B. But page B has an inbound link (from page A) and its new PageRank value hasn't been worked out yet, so page A's new PageRank value is based on inaccurate data and can't be accurate.

Step 2: Calculate page B's PageRank from the value of its inbound links

Page B now has a new PageRank value, but it can't be accurate because the calculation used the new PageRank value of the inbound link from page A, which is inaccurate.

It's a Catch 22 situation. We can't work out A's PageRank until we know B's PageRank, and we can't work out B's PageRank until we know A's PageRank.

Now that both pages have newly calculated PageRank values, can't we just run the calculations again to arrive at accurate values? No. We can run the calculations again using the new values and the results will be more accurate, but we will always be using inaccurate values for the calculations, so the results will always be inaccurate.

The problem is overcome by repeating the calculations many times. Each time produces slightly more accurate values. In fact, total accuracy can never be achieved because the calculations are always based on inaccurate values. 40 to 50 iterations are sufficient to reach a point where any further iterations wouldn't produce enough of a change to the values to matter. This is precisely what Google does at each update, and it's the reason why the updates take so long.
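
As a minimal sketch of those repeated calculations, here is the two-page example from above run through 50 iterations in Python, with the damping factor d = 0.85:

# The two-page example: A and B link only to each other.
d = 0.85
links = {"A": ["B"], "B": ["A"]}    # outbound links per page
pr = {page: 1.0 for page in links}  # any starting values converge to the same result

for _ in range(50):                 # 40-50 iterations is enough to settle
    new_pr = {}
    for page in links:
        inbound = [src for src in links if page in links[src]]
        new_pr[page] = (1 - d) + d * sum(pr[src] / len(links[src]) for src in inbound)
    pr = new_pr

print(pr)  # both pages settle at 1.0 here; with more pages, the proportions are what matter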

One thing to bear in mind is that the results we get from the calculations are proportions. The figures must then be set against a scale (known only to Google) to arrive at each page's actual PageRank. Even so, we can use the calculations to channel the PageRank within a site around its pages so that certain pages receive a higher proportion of it than others.

SEO Smart Links : Essential Wordpress Plugin

Posted by Extreme | comments (3)



SEO Smart Links Plugin for WordPress
SEO Smart Links can automatically link keywords and phrases in your posts and comments with corresponding posts, pages, categories and tags on your blog.

Further, SEO Smart Links allows you to set up your own keywords and a set of matching URLs.

Finally, SEO Smart Links allows you to set the nofollow attribute and open links in a new window.

Everything happens completely transparently, and you can edit the options from the administration settings panel.

How Does it Work?

SEO Smart Links looks for keyword phrases that match the titles of your posts and pages by default (and you can enable category and tag matching too). These phrases are then turned into links. The matching is case insensitive and the original case is preserved.

So if I mention Amazing Grace, which is my theme and also the title of one of my pages, it will be automatically converted into a link.

Features:

  • Find keywords in your posts, pages and comments and link them to your other posts, pages, categories and tags
  • Full control with customizable options
  • Ignore list for keywords you do not want to link
  • Improves your site's interlinking
  • Control external links with custom keywords
  • Add nofollow attribute or open links in new window
  • Caching for speed - make sure you have define('ENABLE_CACHE', true); set in your wp-config.php

Download



  • Download Latest Version


    Installation & Usage
    1. Upload the whole plugin folder to your /wp-content/plugins/ folder.
    2. Go to the Plugins page and activate the plugin.
    3. Use the Options page to change your options
    4. That is all.



      How do I correctly use this plugin?

      Just install and activate it on your blog. By default, SEO Smart Links will find matching links to your posts and pages (if a keyword in your text matches their title).

      Default options are enough to get you going. If you want to tweak it, you can always edit the options. Be sure to check the "ignore" options, where you can state which keywords and phrases not to link.

      How do I enable SEO Smart Links cache?

      Make sure you have enabled WordPress cache by adding this line to your wp-config.php

      define('ENABLE_CACHE', true);

      Please be careful when editing this file and always make a backup!

      Changelog

      v2.1
      - Performance optimization and a new option to link to categories and tags used at least n times (thanks Domink Deobald!)

      v2.0
      - Added the option for case sensitive matching

      v1.9
      - Various improvements and bug fixes

      v1.8.0
      - Added support for non-English characters (Unicode)
      - Added support for keyword synonyms (in the keywords section of the settings screen)

      v1.7
      - Performance optimization of the plugin. SEO Smart Links puts much less strain on the server.

      v1.6
      - Added option to process only single posts and pages

      v1.5
      - Added nofollow and new window options

      v1.4
      - Added option for custom keywords. Now you can specify an unlimited number of keywords and URLs to link to

      v1.3
      - Enabled caching for speeding queries up.

      v1.2
      - Added limits options
      - Fixed small bugs

How to Import/Export a Blog to Blogger

Posted by Extreme | comments (1)




New Feature: Import and Export in Blogspot
Today’s release brings another long-desired feature to Blogger: Import and Export of blogs. Now you can export all of your posts and comments into a single, Atom-formatted XML file for easy backup. You can then import the posts back into Blogger, either into an existing blog or into a new one.

To export your blog, log in to http://draft.blogger.com/ and go to the Settings > Basic page. You’ll see the Blog Tools links at the top of the page for importing and exporting. (We also moved blog deletion up here from the bottom of the page. Don’t worry about accidentally clicking it, though; your blog wouldn’t be deleted until you confirmed on the next page.)

Once you click “Export blog” and press the “Export” button on the next page, your browser will prompt you to save the XML file for your blog. Keep it somewhere safe as a backup, or import it into a different blog. You can import one blog into another from the Blog Tools links, or when creating a new blog. Look for the “Advanced Options” at the bottom of the page.

When you import a blog, all of the posts will get saved in an “imported” state. From there you can publish just a few, or all of them at once. Here are some ideas for what you can do with importing and exporting:

• Merge two or more blogs into one. Take the exported posts and comments from one blog and import them into another one.
• Move individual posts from blog to blog. After importing, select just a set of posts to publish and publish them with one click.
• Back up your blog to your own storage. You can keep your words safe and under your control in case anything happens to your blog, or us, or if you want to remove them from the Internet.
• Move your blog somewhere else. Our export format is standard Atom XML. We hope to see other blogging providers extend their Atom support to include import and export. And, if you decide to come back to Blogger, importing your export file will get you back up and running in seconds.
Caveats

• The export format currently only covers blog posts and comments to those posts, not blog settings or templates. To back up a Classic template, copy and paste the template code from the editor. To back up a Layouts template, use the Backup / Restore template option to download a copy of your template.
• Before importing a blog for the first time, we recommend that you create a new, throwaway blog to import into so you get a sense for how the process works. Once you’re comfortable, import into your public blog.
• At the moment there is a 1MB size limit on the blog you can import. This is a bug that we are correcting.

Have you imported or exported your blog? Let us know about how it went in the comments.

Get Targeted Backlinks to your blog with Comment Kahuna

Posted by Extreme | comments (1)



Every blogger, unless the blog is set up exclusively for private and limited purposes, is interested in gaining more quality traffic and better blog ranking in search engines. One of the widely recognized approaches to improve your position and attract more visitors is receiving quality backlinks. However, while the task sounds simple and self-explanatory, the realization is quite complicated. How can you encourage another blogger, preferably one with good Google PageRank standing, to post a link to your blog?



Comment Kahuna is one of the best free tools on the market to assist you in reaching your goal. It is a simple but efficient utility that helps you find blogs related to the niche of your website. Use it to offer informative comments on blog posts. This will give your website the quality backlinks that search engines look for while evaluating your blog's position. Many of these blogs get lots of visitors, so if your comment is interesting, relevant and informative, some will follow the link to your website.


I checked multiple reviews on the web for user comments, and most of them are super positive, so try it for yourself. It is absolutely free, but registration on the site is required.


Website: http://commentkahuna.com

Disadvantages of Reciprocal Links Exchange

Posted by Extreme | comments (4)



What is Reciprocal Links Exchange

Reciprocal linking is an agreement between two web site owners to provide a hyperlink within their own web site to your web site. Generally this is done to provide readers with quick access to related sites, or to show a partnership between two sites. Reciprocal links can also help to increase traffic to your web site in two ways. Firstly, you will probably have some viewers visit your site from clicking the reciprocal link directly. Secondly, most Internet search engines take into account the number of web sites which contain links to your web site; the more hyperlinks to your site found, the higher up in the search engine rankings (depending on the search term) you'll find your site.
Reciprocal linking between web sites became an important part of the search engine optimisation process thanks to the link popularity algorithm PageRank employed by Google, which ranks web sites for relevancy dependent on the number of links that led to a particular page and the anchor text of the link.

For years people have said you need to submit to the search engines. With the advances in search engine spider technology, it is no longer a requirement to submit to the search engines. Rather it is a requirement to submit to topical directories. Reciprocal links and topical directories help you move forward and open the doors for more people and search engine spiders to find your site.

Three-way linking
Three-way linking (site A → site B → site C → site A) is also a kind of reciprocal linking, but a special one. This type of linking is becoming increasingly popular since some search engines give less consideration to the value of normal reciprocal links (site A → site B → site A). Some search engines give a higher value to web sites that link to another web site without that web site displaying a return link; three-way linking fits this criterion while also offering a reciprocal return link from, say, a sister site.
The aim of this link-building method is to create more "natural" links in the eyes of search engines. The value of links built by three-way linking can then be better than that of normal reciprocal links, which are usually done between two domains.

Automated Linking
In order to take advantage of the need for inbound links to rank well in the search engines, a number of automatic link exchange services have been launched. Members of these schemes will typically agree to have several links added to all their web pages in return for getting similar links back from other sites.

Link Exchange
An alternative to the automated linking above is a Link Exchange forum, in which members will advertise the sites that they want to get links to, and will in turn offer reciprocal or three way links back to the sites that link to them. The links generated through such services are subject to editorial review.



Disadvantages of Reciprocal Link Building

I would like to list below some disadvantages of reciprocal links exchange.

1. The time and effort required is higher than for one-way link building practices.
2. If we exchange links with a bad neighbor, our site may get penalized.
3. A good neighbor may turn bad in the future and put us in trouble.
4. A link exchange partner may remove our link without our knowledge while we still link back.
5. Search engines discourage link exchange.

Guide To One Way Link Building - One Way Link Building Secures Long Term Ranking Results

Posted by Extreme | comments (2)



Link-building should constitute no more than 10-15% of your search engine optimization practice. You want your linking to be efficient, effective, and long-lasting. Here are a few link building tips to help you improve your search engine optimization practice and your search results.

Links start at home
Most people now agree that internal navigation is extremely important to the search optimization process.

1.You want to use concise, meaningful, keyword-rich anchor text in your navigation
2.You want to provide alternative, content-embedded links to deeper content in the site where it makes sense
3.You want to use alternative anchor text for secondary on-page links
4.You want to provide cross-promotional links on as many pages as possible in as unobtrusive a way as possible
Keeping in mind that the user experience should be your first priority, you want to strategically position links throughout your site and your pages. Your navigation should serve both the user’s needs and your own self-promotional needs. But your content provides you with many opportunities to inform your visitors about relevant, related content.

Avoid PageRank manipulation schemes. They are ineffective wastes of your time. Instead of putting rel=’nofollow’ on links pointing to your “About Us”, “Terms of Use”, “Contact Us”, and other important pages, use those pages to point your visitors to content you want them to find. People search for these pages so you need to ensure they can find them. Remember, the user experience has to come first, not the latest SEO fad.

If you’ve developed a cross-promotional link widget (server side includes make this extremely easy to implement), you can embed it on your so-called incidental pages to help people get to the content you value most. In fact, you SHOULD be embedding that widget on those pages. You should be using those pages in every way possible to help the rest of your site.

The alternative anchor text that I suggest for secondary on-page links may puzzle you. We know that in some rare situations Google passes the second link’s anchor text. However, the consensus point of view (which I support) is that you probably don’t want to invest your time in placing multiple links to one destination on a page. However, there are times when the Web designer may feel this enhances usability.

If you’re going to embed a secondary navigational link structure on your page, you might as well use the anchor text to describe those pages with alternative keywords. Regardless of whether that passes any value to the destinations, it gives you an opportunity to use alternative keywords in an appropriate on-page context AND it may help some of your visitors.

Keyword anchor text is relevant to both the linking page and the destination page. A lot of people in the SEO industry have gone to forums and asked why such-and-such page appears in search results when it has no links pointing to it with matching anchor or any apparent paragraphs or titles using the keywords. Every time I have looked at these mystery pages I have found outbound links using the keywords.

SEOs have a huge blind spot for the on-page value that outbound link anchor text provides. It’s still part of the on-page text. It’s not like the link is being treated as something separate from the page.


Leverage your friends’ sites
It’s a time-honored tradition that we all advise our clients to get their moms, pops, and cousins to link to their sites. Frankly, just dropping a link somewhere on a random site doesn’t provide much value in my opinion. I’d rather have a link from a relevant article.

Not some freebie article the friend picked up off the Web and dropped the link into. Rather, I’d want to write the article. It doesn’t have to be a Michael Martinez-quality 50-paragraph text marathon. Just 2 paragraphs of 50-100 words each can provide your friends and relatives with enough text to add a page to their sites.

If you make these micro-essays relevant to both your friends’ content and your own site, you can reasonably embed an author’s credit link that uses link anchor text to help your site. But that’s just cream.

The real value in creating these micro-essay pages is that they stand pretty good chances of ranking well for long-tail queries. I have tons of referral traffic from sites that carry essays I’ve written. So what if their links don’t pass anchor text in Google? They’re sending me qualified traffic. That’s called free advertising in a qualified marketing channel.

Can this technique be abused? Sure. It’s being abused right now. There are link brokers out there selling pay-per-post links. There are people calling and harassing honest Webmasters like me, asking if they can put content on our sites.

For this kind of link building, you MUST draw the line at the end of your friends and relatives. I’m not interested in hosting some cheap schlocky advertorial on my site. But if a friend wants to put an essay on my site, you’d better believe I’ll be very, very agreeable (in fact, I HAVE allowed friends to post essays on my personal sites more than once through the years).

SEOs write guest essays for each other’s blogs. You can create guest pages for your friends. They get some interesting content and you get a link from a page that has a good chance of ranking for something that sends qualified traffic to your site.

Directories, Directories, Directories
For about $100 you can get someone in India to drop your link on 1500-2000 worthless directories. I’ll admit that I toyed with the idea of using one such service last year but after looking at the list of directories I decided there was no traffic value. Maybe I would have gotten some temporary PageRank and anchor text, but we all know Google hates those guys (and probably also Yahoo! and Microsoft).

Directory submission is certainly a useful link building technique, but you should personally vet the directories. Remember, the link you get may not pass PageRank or anchor text but it could still pass traffic. I’ll take a nofollowed link that passes traffic any day of the week, so evaluate the directory. Use third-party resources like Alexa, Compete, and Quantcast to estimate how much traffic they may actually receive. Those estimates won’t be accurate but a dead directory is not likely to show up on those services’ radars, and you just want to know if a directory actually gets traffic.

The fewer ads per page you see on these low-value directories, the better. But if they use the expression “SEO friendly” or in any way talk about the value of links, etc., then don’t even bother submitting to them.

You should set yourself a limit, too. I would not see a need for most sites to submit to more than 20-30 directories.

Micro sites, off-site blogs, and co-owned sites
Someone recently went to a popular SEO forum and asked if they should invade a new query space with micro sites. The poster claimed to be working with an “authority” site (whatever that is). I and a whole lot of other people suggested the micro sites would be counterproductive.

If you have a domain that is well-established in the search results, it’s been around for a few years, and you want to take on a new query space, just put the content on the domain you have.

Micro sites serve useful purposes in many contexts, but they are abused by a lot of inexperienced SEOs and spammers. They are also created wastefully by businesses that don’t understand Internet marketing.

The most compelling reason for creating a micro site (in my humble opinion) is to establish value for a standalone brand. But there are other less compelling reasons that make sense in the right situation as well. You can think of topics and content three-dimensionally, for example, and create concept sites that provide your visitors with tightly related but extremely different content. You literally split a brand value across multiple sub-brands.

This approach works well in the entertainment Web for television shows and movies, as well as online games. User communities, supplemental information, and news/interviews are often posted on micro sites in a network of closely associated sites. The primary brand is the official Web site of the television show or movie. Rarely will you find such a site linking out to other sites (beyond its satellite sites). More often the studios put their outbound links on one of the micro sites, all of which also link to the primary brand site.

These satellite site networks succeed because each site brings enough unique content and concepts to the user experience that it makes sense for people to link to them all (or to more than one of them). News articles link to multiple sites in these networks, fan sites link to them, forums and blogs link to them, etc. It’s a great strategy when you have the right concepts and content to work with.

But understand that the best, most successful micro sites are built on concepts, not content. That’s a HUGE distinction and a lot of SEOs seem to be completely unaware of it.

Still, a lot of business sites can employ the micro site strategy successfully without having to build online games and user forums. For example, you can set up your company blog on another domain. Some SEOs say it looks unprofessional to put a blog on Blogspot or WordPress. Frankly, I disagree with them. But if you feel that’s not an option, there is no reason why you cannot put your business blog on its own domain or sub-domain. You can link to the business site.

Technically, in a network all the sites tend to link to each other, but you give yourself links from newer content on satellite sites to deeper content on the primary site. A blog is more likely to attract new links (if you post to it regularly) than, say, an article archive. If you publish press releases on a regular basis you may have sufficient content to create a press release sub-domain or domain. All of these types of sites, however, should serve unique purposes that don’t really fit in well with the primary domain’s concept.

Companies that operate complementary brands (you see a few such companies in the travel and lodging industries) can usually cross-link their branded domains provided they don’t get greedy. The holding companies rarely provide much content under their own names. However, I have found a few holding companies that actually issue press releases or post other unique, promotional content on their corporate sites that help promote (and link to) the brand sites.

The old press release trick
Ah, yes. The old press release trick. Matt Cutts has indicated that Google is on to the old press release trick. You know what I’m talking about: you run around to 10-15 press release distribution sites and post the same badly written pseudo-press release announcing to the world that you’ve assigned yourself a designated parking space. And the press release links back to your site.

Some press release venues actually drive traffic to Web sites. I know because I’ve published press releases and have seen the referrals. Traffic is a lovely thing to have, and if you can publish a decent press release that is well-written and likely to capture someone’s imagination, it may be worthwhile. Just understand you need to spread the cost of publication across the visitors you receive. You’re not paying for a value-passing link.

But can press releases still pass value? Sure they can. Maybe some services pass value and others don’t. But you know, there are Web sites that will republish press releases for you without charging a fee AND they’ll give you a link. How do you find these sites? Look for them in places where other SEOs are not looking.

There are many businesses that can get other sites to post content about them if the businesses stop thinking in terms of “money for links”. You need to think in terms of “reaching out to a community that is interested in what I do”, and “reaching out to Web sites that want content that I can provide them”.

If your business participates in trade shows, makes presentations at schools or community organizations, sponsors community competitions or sports teams, or otherwise engages with people outside of its normal business operations, the chances are pretty good there are Web sites out there where you can get a blurb, a press release, a custom-written article, or some sort of announcement posted with a link back to your site.

These are the kinds of sites that SEOs cannot drag all their clients to (but don’t be surprised if someone has hammered a few Webmasters in these categories). These sites may be run by local community organizations, local governments, schools, or enthusiasts. They may have small audiences or large audiences. They may be connected to other sites that will pick up their news and events announcements. One link can spread across many sites if you provide compelling enough content.

So, how do you find out if these sites will carry your content? Ask them.

Ask them without including the content.

Ask them before you send the content.

Ask them without talking about search engine optimization.

Ask them without mentioning Google, Yahoo!, or any other search engine.

Contact the site operator and treat him or her like a human being. Don’t shower them with faux compliments about how great their site is.

The people who have been most successful in getting me to post their content on my network have been very honest, very direct. “Michael, our company does X, Y, and Z. We want to work with Xenite.Org to create some content that we feel would be of interest to your visitors. Please contact me for more information.”

Dudes, I’ve picked up the phone and called people right away when I’ve received messages like that. They don’t come from the SEO industry. They come from television producers, TV and movie studios, manufacturers, and other folks. I’ve never seen an SEO be so straight-forward, so honest, and so willing and able to help me create content I can use.

You want a link? Give me unique content that no other site is allowed to have. I’ll give you a link in a heartbeat. If there is a magic bullet in link building, it is unique, original content.

I’ve stopped promoting companies whose marketing campaigns shifted from providing unique, original content to sending out press releases to mailing lists. As soon as you put me on a mailing list you’re dead in the water. As soon as you assume the sites you’ve contacted are all willing and able to republish every piece of tripe you send out, you’re off my radar.

You get value for value. It’s a simple exchange.

Okay, that’s not a press release trick, but that IS where it came from. The old press release trick can work if you don’t treat it like it’s a machine that punches out die-stamped links on demand.

The old blog comment trick
Here on SEO Theory (and over at Best SEO Blog), we regularly delete (and report as spam) posts from people who drop by to say, “Great post Michael, you’ll find I wrote about the same thing at [some Web address I have never heard about]”.

As someone who has leveraged blogs to build traffic and visibility, I feel there are some boundaries that should be respected. Here are the guidelines I try to follow when posting comments on other blogs:

1.As much as possible I just embed my link in the link field that converts my posting name to a link
2.If I feel compelled to drop a link in the message, I try to link to someone else’s site so that I’m making a legitimate recommendation
3.If I feel like I really have something I’ve written that is worthy of note, I drop the link on a blog where I have an established history of participating in discussions and to which I have myself linked
As we all know blog comment spam has been around for years and it’s not likely to go away. Even if the links are nofollowed (and even if your tests suggest that Google occasionally follows nofollowed links), the best value you can get from any link in a blog comment is visibility.

People read and respond to those comments. You can look like a shmuck in more than one way but dropping a link in your message is about as dumb and shmucky as you can look. It says you’re naive, inexperienced, and don’t know how to build links or how to understand their value.

So, yes, I’ve looked dumb and shmucky on occasion, but if people have some idea of who I am, or if I take the time to say something more than, “You can read my blog at [some web address you've never heard of]” there is a good chance they’ll click on the link under my name.

Traffic is all that matters. Interested, converting traffic that produces regular readers who subscribe to the blog feed, come back to read the articles, and maybe provide links to the articles on their own blogs is what I’m after when I’m being self-promotional. Frankly, when I comment on someone else’s site, my first priority is to add my opinion to the discussion — to provide value to the other site, in as much as I can do so.

Wrap up
There are, of course, other ways to build one-way links. But these methods work very well and they help improve the user experience in many ways. The more value you get from a link, the better. You want more than just PageRank and anchor text. You want traffic, visibility, credibility, and ultimately trust (from the people whose sites link to yours).

The less time I spend building links, the better. If I want links in volume I’ve got options that I don’t say much about. Of course, I’ve mentioned 20 Links A Day more than once, and I think it’s a great program. There are alternatives out there and many people have used them on occasion.

In fact, the more linking resources you acquire, the more effective your link building becomes if, instead of saturating each resource with as many links as you can dump on it, you exercise discretion and balance your use of those resources wisely. You should always think in terms of “I need more links, therefore I need more resources”. As soon as you saturate a link channel you’re done with it.
The decision to burn out a resource is a very short-sighted one. The decision to nurture a resource and make it reusable for a long time is a very powerful one.


One-way links explained
One-way links are links to your site which do not receive a link from your site in return.
One-way links send a strong message to search engines that your site is valuable: other sites voluntarily speak about your site.

One major advantage of one-way links is that they are easy to get. One-way links directly promote link popularity, and Google considers link popularity a vital factor in determining SERP position.

Since there are a lot of free ways to get one-way links, you can target relevant sites and get more referral traffic. The Google webmaster guide discourages reciprocal links, so it's always effective to get as many one-way links as possible.


Ways for one-way link building
1. Submit your sites to directories. Try to focus more on major directories like Yahoo! and DMOZ.
2. Create a website with unique, useful and interesting content, which will earn free natural links.
3. Write interesting articles and submit them to article publishing sites.
4. Join forums and participate in discussions. Place your URL & anchor text as the signature.
5. Do blog commenting and place your URL & anchor text. Try to focus on high-PR blogs.
6. Give away free ebooks and white papers that contain your URL.
7. Create a blog for your site.
8. Social bookmark your site. Give visitors an option to bookmark your pages.
9. Join and participate in social networking sites and post your URL.

Does Website Structure Matter to SEO?

Posted by Extreme | comments (0)



Does Website Structure Matter to Search Engine Optimization?
“Does Website structure matter to SEO?” is a great question that should have been asked about 10 years ago. It’s being answered on many SEO blogs and forums, but not in a very formal way, in my opinion.

Website structure helps your search engine optimization in several ways. Let’s take a brief look at how that plays out.

1.Website structure directly influences your site’s crawlability
2.Website structure facilitates or inhibits growth
3.Website structure may improve your on-site optimization
4.Website structure may improve your SERP optimization

Website structure directly influences site crawlability – This point is being made by more people than ever but it seems to be trapped within the thought circles of the SEO community. The difference between a crawlable site and a crawl-inhibiting site structure is most easily measured through cache data freshness. Keep in mind that search engines don’t necessarily recache your page every time they fetch it, but the more often your deep content is recached, the more likely that your site has good crawlability.

Good crawlability ensures that a search engine is most likely to fetch your most important pages more often than others, but also that it’s most likely to fetch a lot of pages rather than just a few. You manage or influence crawl by pointing a lot of links to HTML sitemap pages, embedding links to sibling pages in on-page navigation, and using local hub structures for logical sections of your site.

I generally recommend building at least 3 internal links (from as many different parts of your site) to each page in a complex site.
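
One rough way to audit this on a small site is to crawl your own pages and count how many internal links point at each one. The sketch below is just that, a sketch: the start URL is a placeholder, it uses the third-party requests and BeautifulSoup libraries, and it ignores robots.txt, URL normalization, and non-HTML responses.

import urllib.parse
from collections import Counter

import requests
from bs4 import BeautifulSoup

START = "http://www.example.com/"  # placeholder: your site's root URL

def count_internal_links(start, limit=200):
    host = urllib.parse.urlparse(start).netloc
    inbound, seen, queue = Counter(), set(), [start]
    while queue and len(seen) < limit:
        page = queue.pop()
        if page in seen:
            continue
        seen.add(page)
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            url = urllib.parse.urljoin(page, a["href"]).split("#")[0]
            if urllib.parse.urlparse(url).netloc == host:  # count internal links only
                inbound[url] += 1
                queue.append(url)
    return inbound

# flag pages that fall short of the three-internal-links rule of thumb
for url, n in sorted(count_internal_links(START).items(), key=lambda kv: kv[1]):
    if n < 3:
        print(n, url)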

Website structure facilitates or inhibits growth – If you had to add a new section to your site today, one that is large enough to contain 5 important sub-sections, each loaded with lots of content (pages), what would it take to update your navigation? If you’re thinking, “I’d have to totally rewrite the nav system” then either your navigation has outgrown its usefulness or else you planned poorly for the future.

You should be able to increase the size of your Website by about 30% without having to restructure on-site navigation. This means you won’t severely inhibit crawlability. New sections/content should compensate for their crawl-draining value by adding more internal links to the mix. This is not about passing anchor text to your competitive pages. This is about ensuring that spiders keep finding links to crawl that are deemed important enough to crawl.

If you cannot easily add 30% more content to your site, then you need to start working on a rewrite of the navigation system before you find yourself in crawl crisis mode. It’s broken if you don’t have unused navigational bandwidth, so fix it as soon as possible.

Website structure may improve your on-site optimization – There are four points of optimization that any Web publisher can influence: on-page, on-site, off-site, and SERP. URL structure is an on-site optimization factor, rather than an on-page factor (although most people incorrectly treat it as an on-page factor). Because you may have nested directories, your URL structure may be very complex.

If you believe that search engines pay attention to page URLs (and you should), then your Website structure can be enhanced to help your URLs become more meaningful. You do want to keep them short. You do NOT want to include superfluous sub-directories just for the sake of embedding keywords. You do want to use important, relevant keywords in the page URLs.

Website structure may improve your SERP optimization – I often ask my students, “What is the first thing people see when they look for content on the Web?” Usually no one gets the answer right. It’s the first search results page, not the first Website listed at number 1. What people see on that search results page influences their decision to click through or not click through.

It is possible, for example, to influence people to click on the 2nd result more often than the first, if the 2nd result is clearly more relevant to a query than the 1st result. This happens more often than most people in the SEO industry realize. And keep in mind that “relevance” to a user may be very different from what it seems to be to a search engine.

The user may be looking for a very specific page, and may not be interested in the more algorithmically acceptable content listed above that page. Search engines cannot always deduce what the searchers really want.

Last word on Website structure – There is actually much more to be said about “Website structure” and SEO because, frankly, it goes beyond simple URL construction. You also have to look at page composition factors (are you embedding images, Javascript, iframes, etc.?). And you have to look at how presentation changes from page to page and section to section. Jarring or incongruous transitions may signal some content issues that will impact your search engine optimization.

If you don’t have a consistent page composition structure throughout your site your ability to target keywords and track return on investment for organic SEO is degraded. Uncoordinated page composition usually produces less converting traffic than coordinated page composition.

You need to look at things like percentage breakdowns of page content into boilerplate, injected keywords, original copy, advertising, etc. You also need to look at copy placement, structures used for formatting copy, and even HTML coherency.

Although writing W3C-compliant code is not necessary for search engine optimization, you can ensure your code is not broken by passing compliance tests. I think most people in the industry now agree that if nothing else, writing W3C-compliant code eliminates the headache of tracing broken structures that might inadvertently hide some indexable content from search engine parsers.

So the next time someone asks you, “Does Website structure matter to SEO?”, you’re in a position not only to say yes, but also to explain why and how. And that’s a good thing.

List of Best Tools to Check Your Website Usability

Posted by Extreme | comments (3)



List of tools to test a website

Usability is a qualitative attribute that assesses how easy user interfaces are to use. More importantly, it shows how users behave when they interact with your webpage for the first time.

In this post I'm sharing some very useful tools for webmasters to check website usability. Here's the list:

Feng-GUI

Feng-GUI simulates human vision during the first 5 seconds of exposure to visuals, and creates heatmaps based on an algorithm that predicts what a real human would be most likely to look at.

This offers designers, advertisers and creatives a pre-testing technology that predicts the performance of an image by analyzing levels of attention, brand effectiveness and placement, as well as breaking down the Flow of Attention.

Feng-GUI

Five Second Test

A simple online usability test that helps you identify the most prominent elements of your user interfaces. A central test management screen helps organise and visualise your responses so you can get all the information you need at a glance.

Five Second Test

Loop11

Loop11 is a web-based user-experience testing tool, allowing companies to conduct online, unmoderated user testing on any kind of digital interface. Loop11 is not a survey or web analytics tool, but a user experience tool… helping you to understand user behaviour.

Loop11

CrazyEgg

CrazyEgg is simple: it shows the hotspots where users click on a site. This information is not the same as popular pages; instead, this is practical information about how and where people click on your site. More importantly, CrazyEgg's approach lets you understand the difference between where you want your users to click and where they are actually clicking.

Traditional site tracking tools offer you a ton of information, including:

:: Popular pages :: Entry pages :: Exit pages :: Came from :: Visitor Paths :: Visit Length

CrazyEgg

Clixpy

Clixpy is a web usability testing tool. It’s very easy to install, just by pasting a few lines of JavaScript code in your site’s HTML. When users browse your website Clixpy traces everything they do and then plays it for you, giving you the opportunity to extract any information you may need.

Clixpy

ClickDensity

More than just heat maps, ClickDensity is a full usability toolkit. With a unique integrated A/B test suite, you can trial and analyze improvements at the touch of a button. It takes minutes to set up. From the second you see the reports, they just make sense. It complements your existing web analytics for an unbeatable analysis of visitor trends.

ClickDensity

Know more about Usability

Share your thoughts and ideas about website usability by leaving a comment.

Importance of Sitemaps In SEO

Posted by Extreme | comments (4)



Importance of Sitemap
Sitemaps provide a way for Web sites to specify what pages within the site should be indexed and what new content has been added. Basically, it provides a communication channel between the search engine and the site.


A site map (or sitemap) is a graphical representation of the architecture of a web site. It can be either a document in any form used as a planning tool for web design, or a web page that lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site.

A hierarchical diagram of the pages on a Web site, starting with the home page at the top. A site map helps visitors navigate large, complicated sites by showing its entire structure. It is also used as a master diagram of the Web site for Web designers.

A sitemap is an XML file that contains a list of site URLs and related attributes detailing what should be indexed within a specific site. It must be UTF-8 encoded. The required XML elements in the sitemap file are <urlset>, <url>, and <loc>; the <lastmod>, <changefreq>, and <priority> elements are optional.

The webmaster can generate a sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo, and Ask use the same protocol now, having a sitemap would let the biggest search engines have the updated pages information.

Benefits of Sitemap:

· Site maps can improve search engine optimization of a site by making sure that all the pages can be found.

· This is especially important if a site uses Macromedia Flash or JavaScript menus that do not include HTML links.

· Most search engines will only follow a finite number of links from a page, so if a site is very large, a site map may be required so that search engines and visitors can access all content on the site.



The Importance of Sitemaps from SEO's View
A Sitemap is the representation of a website's architecture and contains links to all the pages of a website. A Sitemap can be of two types, each having its own distinct purpose.

HTML Sitemap:
An HTML sitemap is created with the visitor in mind. It helps the visitor in navigating the website along with giving a clear view of the website's sections and categories.

XML Sitemap:
While the HTML Sitemap is designed with visitors in mind, an XML Sitemap is designed specifically for Search Engines. Search Engines send their robots, also called searchbots, to index a website, wherein they encounter the sitemap.xml file.

Why is a sitemap.xml file important?

1. The Sitemap simply lists out all the pages of a website in front of the searchbots so that the searchbots know about all the pages in the website and index them.

2. A Sitemap helps the searchbot in distributing PageRank across all the web pages with the help of the <priority> tag. This ensures that all the inner pages of a website also have a good page rank.

3. The <changefreq> tag of the sitemap.xml file gives the searchbot an indication of how frequently a web page will change. This tells the searchbot to visit the website with the same frequency to index the changes made.

One can either generate a sitemap.xml file manually, or use one of the free sitemap generator tools available online. Though most of the free sitemap generator tools have a limit of up to 500 pages, paid tools give you the freedom to create a sitemap of unlimited pages.

A typical sitemap.xml file looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>http://www.yoursite.com/about-us.html</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.yoursite.com/services.htm</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.yoursite.com/contact.htm</loc>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>
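If you would rather script the generation, here is a minimal sketch in Python; the page list, priorities, and change frequencies are made-up placeholders you would replace with your own.

# sitemap_gen.py - a minimal, illustrative sitemap.xml generator.
from xml.sax.saxutils import escape

# Hypothetical page list: (URL, change frequency, priority).
pages = [
    ("http://www.yoursite.com/", "weekly", "0.9"),
    ("http://www.yoursite.com/about-us.html", "weekly", "0.8"),
    ("http://www.yoursite.com/services.htm", "monthly", "0.8"),
    ("http://www.yoursite.com/contact.htm", "weekly", "0.7"),
]

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url, changefreq, priority in pages:
        f.write("  <url>\n")
        f.write("    <loc>%s</loc>\n" % escape(url))  # escape &, <, > in URLs
        f.write("    <changefreq>%s</changefreq>\n" % changefreq)
        f.write("    <priority>%s</priority>\n" % priority)
        f.write("  </url>\n")
    f.write("</urlset>\n")

Running the script produces a file equivalent to the sample above, ready to be submitted to the search engines.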

Worst SEO Mistakes You Can Make

Posted by Extreme on , under | comments (0)



Biggest Search Engine Optimization (SEO) Mistakes Experts Make
Search engine optimization is a very hot topic on the World Wide Web. After all, everybody wants to rank higher, come up on the first page of Google search, and get more traffic. I have identified and made a list of the top 15 SEO practices that I tend to forget quite often. Practiced properly, these simple SEO techniques can make a significant difference in how my pages rank in search engine queries.

1. Use the rel="nofollow" attribute on low-value links so they do not pass PageRank juice, for example 'Read the rest of the entry', 'About', 'Contact', etc.
2. Use proper anchor text for internal links. Don't use 'here' or 'there'.
3. Optimize your images: always create alt attributes and write a description in the alt text.
4. Use search-engine-friendly permalinks. Make sure the URLs do not contain characters such as '&', '?', or '!'.
5. Use hyphens (-) between words to improve readability (see the sketch after this list).
6. Do not use underscores (_); use hyphens (-) instead.
7. Do not use session IDs in URLs.
8. Use sticky posts.
9. Use tag clouds.
10. Have a category description paragraph.
11. Let visitors subscribe to category-specific RSS feeds. (Use a category-specific RSS plugin for WordPress.)
12. Use internal linking when possible and appropriate.
13. Use sub-directories rather than sub-domains when possible. Sub-domains do not share link love from the main domain, as each is treated as a separate domain.
14. Research the target audience and aim the site content appropriately.
15. Keep the content up to date. Visitors don't like outdated content, and frequent updates also attract the search engine spiders to index your pages more often.
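For points 4 to 6, here is a minimal sketch of turning a post title into a search-engine-friendly slug; the function name and the sample title are just illustrative placeholders.

import re

def slugify(title):
    # Lowercase the title, replace every run of non-alphanumeric
    # characters with a single hyphen, and trim stray hyphens,
    # so the URL avoids '&', '?', '_' and friends.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("Worst SEO Mistakes You Can Make!"))
# prints: worst-seo-mistakes-you-can-make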

Have you identified any SEO mistakes that you commonly make?

Seo Tips for Launching a New Website

Posted by Extreme on , under | comments (1)



Basic Tips For Launching a New Website While Keeping SEO In Mind
Search engines like Google and Yahoo have become intelligent in the past 4-5 years, and it is very difficult to game them nowadays by manipulating tags and stuffing keywords. For a new site, it may take 6 to 9 months to achieve competitive rankings on Google. From an SEO point of view, I recommend the following steps for launching a new website.

i) Registering a domain name with relevant keywords
Pick a domain name which either has keywords associated with your product/service or carries your brand name. It is better to go for short domain names which are easy to remember. If your target market is global, go for a .com domain. If you are targeting a particular country or language, it is better to register a domain name in the TLD of that country; for example, go for a .jp or .co.jp domain if you are targeting Japan. I recommend registering the domain for at least 2 years, and the longer the better, to gain the trust of search engines.

ii) Domain Hosting
Host your site with a reliable hosting company. Stay away from companies which allow hosting of pharmacy, gambling and adult sites, or have a history of serving spam. If you are targeting a particular country/language, it is advised to host your site in that country to boost your rankings in the local search engines for that country/language.

iii) Registering with Search Engines
Since search engines give importance to domain age and can track parameters like the registration date and the first-crawled date, you should register your site with the search engines as early as possible. Even if your site is not fully ready, it is better to put up a temporary one- or two-page site and register it with the search engines at the earliest. Getting indexed in MSN may take only 3-4 days, whereas it may take 2-3 months to get indexed in Google and Yahoo. To speed up this process, you should build incoming links from authority websites. The more links a site receives, the less it is ignored by the search engines, and links also reduce the Google sandboxing time for new sites. Submitting your sitemap to Google Webmaster Central also ensures that your site gets crawled and indexed regularly.
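You can also notify the engines over plain HTTP by pinging them with your sitemap URL. Here is a minimal sketch; the sitemap address is a placeholder, and the ping endpoint shown is the one Google documented for the sitemaps protocol (endpoints like this change over time, so treat it as illustrative).

import urllib.parse
import urllib.request

sitemap_url = "http://www.yoursite.com/sitemap.xml"  # placeholder

# Google's documented sitemap ping endpoint for the sitemaps protocol.
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as resp:
    print(resp.status)  # 200 means the ping was accepted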

iv) Build trust with Search Engines
To build trust with the search engines, you should build your incoming links and content at a regular pace. Keep your content unique and relevant to your products/services. Building links from relevant websites will boost your rankings.

v) Have Web Analytics In Place; Monitor Results
The only way to know whether you are meeting your goal (and delivering results to your business to make up for the investment!) is to track and monitor your increase in visitors and conversions. As a general rule, if your end result is too design-focused, you'll see fewer leads; if it's too search-focused, you'll see fewer conversions. Keep balance in mind and the results will come; your boss will thank you for it!

Building Backlinks through Forum Posting

Posted by Extreme on , under | comments (1)



Backlink Building through Forum Posting / How to Build Backlinks
There is no doubt that quality backlinks from relevant sites are important for SEO success. If a page has good content, other websites will start linking to it in a natural manner; this is the main hypothesis behind including backlinks in the search engine ranking algorithms. Thus, a page with better content will naturally have more backlinks, and will rank better. In practice, however, one will have to pursue other ways of gaining relevant, quality backlinks as part of an SEO strategy.

Getting listed in web directories, submitting to article directories, and posting in forums and on blogs are some of the acceptable methods of acquiring quality backlinks. Search engines generally index forums frequently, so posting in forums is a good way to gain quality backlinks with the anchor text of your choice. While submitting articles to directories can be a time-consuming process, posting in forums does not require much effort or time; however, one has to do one's homework and take some precautions in order to use forums successfully for gaining backlinks.

You must do some research before joining any forum. Join forums that are relevant to the sites you want backlinks for, and choose only forums that are popular and active; backlinks from high-authority forums are very valuable. The number of active members and the Google PageRank of the forum can give you a good idea of its popularity. Keep your signature short and link it to your main website. Never create posts that sound like propaganda or that are irrelevant to the thread topic. Pay attention to the TOS of the forum, or you will risk getting banned and losing all the backlinks you have built.

If you post something that is important or useful to other readers, they will visit the site in your signature. Thus, posting in forums can drive significant traffic to your website, apart from building relevant backlinks. Forums are also good for building your brand or image; some of the best bloggers, Internet marketers, and SEOs spend significant time posting on forums to increase their visibility. That is another benefit you can get from forum posting, beyond using it as a link development strategy.

Changing Title Meta Tag Hurts Google Rankings

Posted by Extreme on , under | comments (2)



Does Changing the Title Tag Affect Google Rankings?
It is no secret that the title meta-tag is one of the most important factors when it comes to on-page search engine optimization. Search engines including Google, Yahoo, and MSN give quite a bit of weight to the title and description meta tags in their algorithms. What many people do not know, however, is that the same title tag could actually hurt your rankings if you change it abruptly in the hope of improving your search engine rankings.

If you are currently ranking well with your page title, you should not risk changing it to improve your rankings. You may well see your current Google rankings go down after such a change. I have experienced negative results myself when I tried to change the title tag of a site which was already ranking well; in fact, Google slammed the site with a -950 penalty.

Your page title is the main entry gate for search engines, and changing the title meta-tag can easily make your Google rankings fluctuate. If you change your title, Google and other search engines will take some time to reflect rankings for the new title. In particular, it looks suspicious in the eyes of Google and other search engines if you change the main title and description tags after leaving them untouched for a long time.

In short, changing the main title of your site while it is already ranking well is not a good idea. Instead of changing the page title to try to improve on certain keywords, it is better to add content about those keyword phrases to your page, and to add links with appropriate anchor text from sites with relevant content. Google has always penalized over-optimization; now it seems to frown on normal search engine optimization too. So the best policy is to optimize the website at the time of creation: do it once, and do it right!
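If you want to catch an accidental title change before Google does, here is a small sketch that records the current <title> of a few pages; the URL list is a placeholder, and the regex is a rough-and-ready way to pull the tag out of the HTML.

import re
import urllib.request

pages = [
    "http://www.yoursite.com/",               # placeholder URLs
    "http://www.yoursite.com/about-us.html",
]

for url in pages:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    print(url, "->", match.group(1).strip() if match else "(no title found)")

Run it on a schedule and diff the output, and you will notice a changed title immediately instead of discovering it through a rankings drop.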

Google Page Rank Updates : Last Google PR Update

Posted by Extreme on , under | comments (0)



It looks like Google is rolling out a PageRank update. I woke up this morning and some of my sites were getting a new PR.
Are you guys seeing the changes too? Let me know how your PR fluctuated.

It looks like Google is settling back into a regular schedule for PageRank updates, though. The last one happened on 29/30 October 2009, so that is more or less one PR update every three to four months. It would be good if they kept that pace constant.

When’s the Next Google PR update?
Let's see if we get another update in the middle of January or February 2010.


Dates of previous PageRank (PR) updates:
Last confirmed – 30 October 2009
Confirmed – 27/28 May 2009
Confirmed – 1/2 April 2009
Confirmed – 30/31 December 2008
Confirmed – 27 September 2008
Confirmed – 26 July 2008
Confirmed – 29 April 2008
Confirmed – 9-12 January 2008
Confirmed – 26 October 2007
Confirmed – 28 April 2007

How to Make Mozilla Firefox Load Faster

Posted by Extreme on , under | comments (1)



How to make a Website load faster in Mozilla Firefox

Firefox may run quickly but it loads slowly; here's how to fix it.
(Note: this tip is for experienced computer users only.)


You can slash Firefox's slow load time by compressing its DLLs and executables. There are many choices for compression, but I suggest you use UPX, which is free, efficient, and time-proven.

1. Download UPX from http://upx.sourceforge.net/#download

2. Unzip upx.exe into your Firefox installation folder, which is normally C:\Program Files\Mozilla Firefox.

3. Make sure Firefox is not running, then open a command prompt in the Firefox installation directory.

4. Type the following command on a single line and hit Enter:

for %v in (*.exe *.dll components\*.dll plugins\*.dll) do upx "C:\Program Files\Mozilla Firefox\%v"

5. If on some later occasion you want to unpack the files, just type the command above again, adding the decompression switch "-d" after "do upx", as spelled out below.
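For reference, the full decompression command described in step 5 is:

for %v in (*.exe *.dll components\*.dll plugins\*.dll) do upx -d "C:\Program Files\Mozilla Firefox\%v"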

That's it; enjoy the difference!

How to Check if Your Website is Penalized by Google

Posted by Extreme on , under | comments (0)



How to Check if a Website is Penalized by Google
A question all webmasters ask is: how do you check whether your website has been penalized by Google?

Here are a few points to help you verify whether your website is a victim of a Google penalty.

Keyword Rankings:

You were at the top of the results for some of your keywords, and now you see a major, sudden drop across all of them. This indicates that your website has probably been penalized by Google, because your Google SERP positions have fallen sharply.

Number of Pages Indexed:

Previously, the site: operator showed a large number of pages from your website in Google; now it shows very few, and most of your pages are no longer in the Google index. Major de-indexing like this also indicates that your website is in trouble.

Page Weight Loss:

Suppose your pages still show up in Google when you use the site: operator. The next thing to check is the weight of these pages: even though they are in the Google index, they do not appear in search results (or appear very low) when you search for the complete post title plus your blog or website name.

For example: Secret of Google + GoogleTips, where "Secret of Google" is the keyword you are searching for and "GoogleTips" is the name of your blog or website.

Google Cache:

The Google cache is another way to find out whether your website has been penalized. Google normally keeps a cached copy of every website on the internet, and how often that cache updates depends on many factors. For popular websites like CNN, the Washington Post, or TechCrunch, you will find that the date of the last cache is the current date or just one day back.

If you check your website with the cache: operator in Google and find that the date of the last cache is very old, it means Google is no longer showing interest in your website. Most likely it's a penalty.

Complete De-index:

This is the simplest check for a Google penalty: your site has been completely de-indexed by Google and no longer appears with either the info: operator or the site: operator.

Number of Backlinks:

A few days back, Google was showing thousands of backlinks for your website, and now the link: operator shows a huge drop in their number. If this is the case, a Google penalty is a real possibility.

Index Time:

Index time is another factor by which you can check for a penalty. Before the penalty, new posts from your website were getting indexed within hours; now they take a week. This is a clear sign of a Google penalty, or at least that Google is having trouble indexing your website.

Google PageRank:

Google PageRank is a value attached to your website, and you may see it drop if Google has penalized you for any reason. If your PageRank was, say, 4 a few days ago and has now become 2, it is very likely that Google has penalized your website.

Wait for Google Mail:

Google usually sends mail when it finds something harmful on your website. It will de-index the pages that have the problem and inform you so you can fix it. Once you fix the problem, you can easily get back into the Google search index.

Important operators used in this article:

link: operator - used with your website, shows the number of backlinks to your website.

info: operator - used with your website, shows the information Google holds about your website.

site: operator - used with your website, shows the number of pages from your website indexed in Google search.
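If you check these operators regularly, a tiny helper like the following sketch saves some typing: it just prints the Google query URLs for each operator so you can open them in a browser (the domain is a placeholder, and the operators themselves are whatever Google supports at the time).

import urllib.parse

site = "yoursite.com"  # placeholder domain

for op in ("site", "link", "info", "cache"):
    query = urllib.parse.quote("%s:%s" % (op, site), safe="")
    print("https://www.google.com/search?q=" + query)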

What Is Google Caffeine? New Version of Google Search Is Launching Soon

Posted by Extreme on , under | comments (6)



Google Caffeine is an "under the hood" development of the Google search engine algorithm that will slightly change how sites rank on Google's search engine results pages (SERPs). The arrival of Google Caffeine means that your current website search engine optimization (SEO) may become less effective, and you may lose or gain position in search engine results for certain keywords. Google Caffeine is not yet integrated into the standard algorithm, but you can observe where your website and other websites stand in 'Caffeine-induced' results by visiting the Google Caffeine test site, http://www2.sandbox.google.com

Google "Caffeine", the new search engine improves the index size, the speed of the queries and most importantly, changes the value of search engine rankings.

In a post on Webmaster Central Blog, Google notified the world that the next Google search engine was ready for testing.

My first impressions of Google "Caffeine" have been pretty good. Search results in both the new and the old Google come back lightning quick. I have to take Google's word that the new search engine is a few milliseconds quicker, but it won on almost every search I did (with one tie).

I also have to take their word on the number of results: I am sometimes seeing as many as 10 times the search results in the new Google, and I'll assume Google knows how many results it has. Interestingly, when I searched for things like "Online pharmacy", the new Google returned fewer results than the old one. This tells me that the new Google is smarter at finding fake websites and de-indexing them.

The results are what make Google so popular, and they will be the true test of how good this new engine is. In my tests, the new Google pulls significantly different results than the old Google. For what I was searching for (my name, people I knew, events, computer hardware), the results were significantly better. In fact, it looks like the search keywords have become a much bigger factor than before, and I'm seeing smaller sites rise to the top more often overall.

Search engine optimization (SEO) specialists are going to need a whole new formula for getting their clients to the top, and who knows exactly how this new Google search algorithm works? SEO people will have to start nailing down the new rules of PageRank to keep their customers on top.

Finally, how does it compare to Bing?

Not too bad in my tests. It is certainly faster as well. I've been a Google person for the last 6 years, and I am not seeing anything bad in the new Google, or good in the new Bing, that would change that.

Speaking of making a difference, it would be nice if some developer would add this new search to the Safari/Firefox browser search bars. That would really boost my testing capabilities. Any takers?

Is Google Changing the PageRank Algorithm?

Posted by Extreme on , under | comments (2)



Has Google Changed Its PageRank Algorithm?
The latest news around the blogosphere is that the Google PageRank of several large sites has been hurt. Here is a list I gathered of big blogs that supposedly lost PR in this shake-up:

■Engadget (from 7 to 5)
■AutoBlog (from 6 to 4)
■Problogger (from 6 to 4)
■Copyblogger (from 6 to 4)
■Search Engine Journal (from 7 to 4)
■Quick Online Tips (from 6 to 3)
■Search Engine Roundtable (from 7 to 4)
■Blog Herald (from 6 to 4)
■Weblog Tools Collection (from 6 to 4)
■JohnTP (from 6 to 4)
■Coolest Gadgets (from 5 to 3)
Update: It looks like mainstream websites that were selling links were also penalized:

■Washington Post (from 7 to 5)
■Washington Times (from 6 to 4)
■Charlotte Observer (from 6 to 4)
■Forbes.com (from 7 to 5)
■SFGate.com (from 7 to 5)
■Sun Times (from 7 to 5)
■New Scientist (from 7 to 5)
■Seattle Times (from 6 to 4)

Andy Beard thought the drop was because of text link selling, which was reported about a week or so ago. This turns out not to be the case.

Techcrunch reported that Google didn't drop the PageRanks because of the selling of text links, but because of link farms. Link farms are networks where each site provides hundreds of outgoing links on each page to other blogs in the network, in some cases creating tens or even hundreds of thousands of cross links.

This all comes a week after the linking characteristics of Techcrunch were analysed, where it was reported that a third of all Techcrunch outgoing links went to related Techcrunch sites. The link farm explanation would account for why the Techcrunch PageRank hasn't changed while the Crunchbase ranking is now at 0.

These changes will affect a lot of blog networks that survive on text link ads and related sales that depend on strong Google PageRanks. A drop from PR7 to PR4 shouldn't really affect traffic too heavily, but it will make the tough job of selling ads much tougher. In the coming months and years, I think we will see a lot of small blog networks starting to struggle and trying to find other ways to survive.

About a month ago I saw a decrease from PR3 to PR2 for one of my sites, and I found it difficult to work out why. When Google then announced that selling text link ads would bring a punishment, I finally found out.

So, why the decrease? While I wait to research it further, there are several reasons why Google may be changing its PageRank algorithm.

Paid Linking : The easy excuse is that they’re targeting paid links, but not all sites which experienced the drop sell or buy links.

Mass Linking : Do we link out to too many sites via blogrolls? Does linkbait just result in TOO MANY links, even if they are natural? Do blog networks use influential linking to their advantage? I think PageRank has been spread too thin, and Google is changing its PageRank formula to address the mass publishing which has taken place over the past 2 years.


Devaluing PageRank :
PageRank is seen by many as the be-all and end-all value of a web site. Our PageRank dropped, but we are receiving more Google search traffic than ever. PageRank does not define site rankings in Google, nor traffic, and it should not be mistaken for either.

Link Building to Increase Website Traffic

Posted by Extreme on , under | comments (1)



How to Increase Website Traffic
Do you offer free promotions and monthly contests for new members? No matter how many bells & whistles your site has, all of this is virtually worthless if you cannot drive traffic to your website. Building links to your website is one of the smartest things a webmaster can do to establish a solid web presence. You will have direct traffic as a result of people clicking on the links, and indirect traffic from partner sites with higher rankings at the major search engines.

As you might already know, Google uses a system called link popularity and link reputation to determine a site's relevance and position in the search rankings. The mantra stands: the more links you have leading to you on other web pages, the higher your site’s ranking. Link reputation, on the other hand, means how important the incoming links are to your webpage. If you have your website link at a Professional Dog Trade Show site with 200,000 monthly visitors, you will have a higher link reputation than if you had it posted on Sally's Personal Dog Page with 10 visitors a month. In essence, the more traffic your "affiliate" sites have, the more illustrious your link reputation and the higher rankings you will achieve.

Posting your links to just any website is not the idea here. Since search engines surface the most relevant results, the websites carrying your link should be on the same topic of interest as yours. Suppose Bill Smith searches for "Ultimate Championship Fighting". The first group of relevant sites would be the ones with the most links from sites about Ultimate Fighting. Besides, it is not often you see a woodworking site linking to a Porsche appreciation page. Optimizing your website for the keywords Ultimate Championship Fighting and affiliating with related fighting pages will net you a lot more traffic, leading to higher link popularity. So go ahead: become affiliated with related websites (preferably the highest-ranking ones you can reach) and get your link posted on their pages for higher traffic.
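To make the link popularity idea concrete, here is a toy sketch of the PageRank-style calculation behind it: a simplified power iteration over a made-up three-page link graph. The real algorithm operates at web scale with many refinements, so treat this purely as an illustration of how incoming links raise a page's score.

# Toy PageRank: simplified power iteration over a tiny link graph.
links = {
    "home": ["about", "services"],
    "about": ["home"],
    "services": ["home", "about"],
}
pages = list(links)
d = 0.85  # damping factor, as in the original PageRank paper
pr = {p: 1.0 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):  # iterate until the scores settle
    new = {}
    for p in pages:
        # Each page linking to p donates a share of its own score.
        incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - d) / len(pages) + d * incoming
    pr = new

for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

Pages with more (and better-scored) incoming links end up with the higher score, which is exactly the "link popularity" effect described above.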

Google Caffeine: Google's New Search Engine Index

Posted by Extreme on , under | comments (0)



Is Google Caffeine Faster?
Microsoft has recently unveiled its new search engine, Bing. And with the recent announcement that Bing will soon power Yahoo's organic search results, Google needed to do something to keep its share of the search market.

Google has unveiled a new test version of their search engine, which is being called "Caffeine". This is being touted as the "next generation of search".

Google's Matt Cutts said, on the Google Webmaster Central blog that they're very interested in feedback:

"Right now, we only want feedback on the differences between Google's current search results and our new system. We're also interested in higher-level feedback ("These types of sites seem to rank better or worse in the new system") in addition to "This specific site should or shouldn't rank for this query." Engineers will be reading the feedback, but we won't have the cycles to send replies."
By letting the public test the new version of Google search (which is noticeably free of Google AdWords ads), Google is able to use the public as its reviewers, and typically Google's best critics will reveal issues that need to be addressed. If you are testing the new version of Google and you find a search result that is not to your liking, there is a "Dissatisfied? Help us improve" link at the bottom of the search results page.

What is different between the old version of Google (what we currently see at www.google.com) and this new "Google Caffeine" version of Google? Some are saying that this new version is much faster than the older version of Google. Mashable's conclusion is that "This search is not only faster, but in some instances in our few tests, seems more capable of producing real-time results."

One of the claims about Google Caffeine is that it does a better job of including recent search results. Let's take a recent search phrase, one that Google most likely would not have indexed a few days ago, and compare the results. One of the "trending topics" on Twitter as I write this is "RIP Eunice Kennedy", so let's use this keyword phrase as a test.

On Google Caffeine there appears to be a search result that was indexed 25 minutes ago. Google does not typically show a "cached" version of recent search results.

[Screenshot: the search results on Google Caffeine]

And on the "normal" Google search results, the search results appear to be almost exactly the same:

[Screenshot: the same search on the regular Google]

In fact, what is interesting is that both versions of Google are currently indexing Twitter statuses from one hour ago or even sooner. I honestly expected this new Google Caffeine version to be indexing Twitter statuses much faster than that. So, let's see if this is the case. I searched for this phrase on both Google Caffeine and the "normal" Google: [site:twitter.com "RIP Eunice Kennedy"].

Google Caffeine is not indexing as many web pages as the "normal" Google search, while the timeliness of the search results appears to be about the same. There are "tweets" from Twitter.com that show up in the search results (on both Caffeine and Google.com) from as recently as 10 minutes ago. A quick test on another trending topic on Twitter reveals the same results on both; a search for this shows Google is about 21 minutes behind: [site:twitter.com "Social Media Pillows"]

What about comprehensiveness? I have tested many searches on both Google Caffeine and Google.com and am not noticing any better indexing (or indexing of more pages) for several websites I have tested. I used the "site:" command on both and found that on some searches Google Caffeine is indexing more pages.

But on other "site:" searches, there are more pages indexed on Google.com. So, I'm not convinced (yet) that this new Google Caffeine is more comprehensive. In fact, the old Google.com has more pages indexed on Twitter (site:twitter.com) than Google Caffeine.

What about relevancy? I tested several search results, including those that included city names and specific "local searches" and I'm not seeing much of a difference at this time. So, the jury is still out: Google Caffeine appears to be faster than "old Google", but the other changes that have been made to Google Caffeine are not really that noticeable.

Lesser Known Method of Blog Promotion

Posted by Extreme on , under | comments (4)



A few days ago, while browsing BlogCatalog to find some good news and to promote my blog, I found this useful thread. It's all about a lesser-known method of blog promotion. If you're interested, read on!

Written by BlogBadly, taken from one of his discussions on BlogCatalog (with the necessary edits):

Whenever someone makes a "SEVEN EASY TIPS FOR BLOG PROMOTION/PR INCREASE" post anywhere, I rarely see submitting your posts to blog carnivals mentioned.
Blog carnivals are usually held by one blog. In them, the owner reads submissions and decides which ones to put in a post about the carnival. About 5-10 entries are picked for each post. The owners of the carnivals generally don't expect a post back to them - maybe just a trackback - so it's basically a free link to your blog. It can attract readers too. It's not spammy, either, as all of the posts are related to the same category (business, satire, family, pets, etc.)
It's really easy to do: just go to blogcarnival.com, sign up, look for some carnivals that match your blog category, and submit. You can also run your own carnival if you want.
Blog carnivals are actually quite good promotion methods - they won't harm you, at least. Unless you get dizzy on the merry-go-round. Ha. I made a bad joke.
And on a quick note, Newsvine.com (a news site where people tag news/opinion/other articles and post them on their own page) gives a free backlink if you submit your own site (dunno if you guys do that; I don't see any penalties for it). It also appeared on my Technorati blog feedback with an authority of 50-100. It's another option if you have a news blog or just have a few news posts.

Mytheory's opinion: Actually, some bloggers already know about this method of blog promotion through blog carnivals, and they like it because it does drive traffic to their blogs (if you have great posts, of course). So I suggest you create really good content first if you want to submit it to a carnival.

Nofollow Links - Advantages & Disadvantages

Posted by Extreme on , under | comments (4)



Nofollow Links - The Good Side and the Bad Side
Blog commenting is one of the best ways of getting backlinks to our sites/blogs, and if you frequently gather backlinks that way, you must be familiar with the dofollow and nofollow link attributes. You may wonder why its designers (Matt Cutts and Jason Shellen) created the nofollow attribute, since all we know is that nofollow links don't pass any reputation to our pages (such as in the PR calculation). But recently I found out why the nofollow attribute is so important, both for site owners and for those who want to gather backlinks.

:: Good side / Advantages

1. It prevents comment spam.
Remember that most people who want to gather link juice do it by leaving poor-quality comments on blogs which use dofollow attributes; they think it is a waste of time to leave a comment on a nofollow blog. So a nofollow blog has a smaller chance of receiving comment spam.
2. You don't want to pass reputation on to a website.
For example, you write a post about how people spam comments, and you provide a link pointing to a website which offers an automatic commenting service. You don't want to give any reputation to that kind of website, so you can put the nofollow attribute on that link (see the sketch at the end of this post).
3. To be trusted by the major search engines.
If you are smart about putting nofollow attributes on particular links on your blog/website, you will be trusted by the major search engines. For example, if you put nofollow on links to websites the major search engines do not trust, they will consider your blog a trusted site. Besides, by activating nofollow in particular areas of your blog you avoid comment spam, which is good for your content, since search engines don't like blogs with a lot of comment spam.


:: Bad Side / Disadvantages

1. Less chance of getting comments from other bloggers.
Since most people will only comment on dofollow blogs, to get some reputation credit for their own pages, your nofollow blog will be passed over by them, which will slightly reduce your total number of comments.
2. It is harder to get link juice to your website/blog.
Say you want to gather backlinks with the blog commenting technique. If you really want reputation credit for your website, you have to leave comments on dofollow blogs, so your task becomes harder: besides writing high-quality comments that the blog owner will approve, you have to hunt down genuine dofollow blogs on the same topic as yours.

::. Some suggestions

1. Be smart about placing nofollow attributes on particular links/areas.
Don't apply nofollow to the whole of your website. Focus on the comment section, or on particular links pointing to websites you don't want to give any reputation credit to. You can also switch to dofollow once the comment section of your blog gets busier and real discussion starts between you and your commenters.
2. When leaving comments, don't worry about nofollow or dofollow.
Why do I suggest not bothering about nofollow or dofollow when we leave comments on other blogs? Because our backlink profile will look more natural, containing both dofollow and nofollow links.
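As a small illustration of point 2 in the advantages list above, here is a rough Python sketch that adds rel="nofollow" to every link in a chunk of comment HTML. Real blog platforms do this for you, and the regex is a deliberate simplification: it assumes the anchors don't already carry a rel attribute.

import re

comment_html = '<p>Try <a href="http://spammy-service.example.com">this service</a>!</p>'

# Naive rewrite: insert rel="nofollow" into each opening anchor tag.
nofollowed = re.sub(r"<a\s", '<a rel="nofollow" ', comment_html)
print(nofollowed)
# prints: <p>Try <a rel="nofollow" href="http://spammy-service.example.com">this service</a>!</p>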

Why Working From Home is Nothing New | Work From Home

Posted by Extreme on , under | comments (3)



Working From Home is Nothing New

Working from home is a whole new way of working — a revolution in industry, in society, in the way we live. Or is it? While making a living by sitting in a café with a frappuccino and a two-way link to the cloud might be something your parents never dreamed of doing, the idea that you can ignore the corporate world and earn from home is actually about as modern as iron horseshoes and knitting needles. In fact, not only are today's home-based tech workers more traditional than the average cubicle drone, they actually have a long way to go before their numbers come close to those of the good old days, despite recent trends.

According to the US Census Bureau, the number of people who work at home more than two days a week increased by 56 percent between 1980 and 1990, from 2.2 million to 3.4 million. That's a remarkable rise, made all the more impressive by happening before the expansion of the Internet. In the decade following 1990, as communications improved and email replaced memos, the figure increased by a further 22.8 percent to reach 4.2 million people. By 2000, the Census Bureau reports, 3.3 percent of the working population was able to skip the commute for most of their workweek.


When One in Fourteen Worked from Home

But those figures are still significantly lower than the numbers in 1960, when almost 4.7 million people were earning their keep from home – a full 7.2 percent of the working population. That number halved over the following twenty years, a decline which the Census Bureau puts down primarily to the closure of family farms and the movement of doctors and other professionals away from home offices and towards large shared practices.

But it wasn’t just the last of the small farmers and home-visiting doctors who were able to call their homes their workspaces in the 1960s. Some of the most important contributions to American culture were being produced in home offices even before the era of free love and one-way commutes to Southeast Asia.

Pay a visit to Frank Lloyd Wright's home in Oak Park, Illinois, for example, and you'll be able to see not just the house in which the creator of the Prairie style lived from 1889 to 1909 but also the office in which he designed 125 of the country's most important structures. Nor was his own home just a workplace: it was also an architectural laboratory in which he tested his design concepts and theories. Most home workers work in their house; Frank Lloyd Wright's house was also his work.

That a creative professional like an architect should be able to avoid an office building is perhaps not surprising. Designers, painters, sculptors and other arty types tend to work alone, relying on their own inspiration to deliver their ideas. They rarely need the kinds of equipment that’s best supplied by large office buildings and having secretaries, assistants, sales staff and watercoolers around might even be distracting. Around 40 percent of artists are believed to work from home studios – or at least they do until children come along and claim the studio as their bedroom.

The Web’s Work from Home Industrial Revolution


According to the 1990 census though, almost half of all home workers were in the service industries, which included business and repair work, entertainment and recreation, and “other professional and related services.” By 2000, 1.9 million people were providing “professional services” from home – by far the most popular category – but there were also more than 42,000 people preparing food professionally in their own kitchens and over half a million cutting hair, giving massages and delivering other kinds of personal care. Interestingly, almost 5,000 people in the fishing, hunting and forestry professions worked from home at the start of the millennium too. You have to wonder about the size of their yards.

Even this variety might not be anything new. Perhaps the most important characteristic of the Industrial Revolution was the movement to cities as factories became the shared workspaces of a new urban working class. But what were those new proletarians doing before the opening of the mills and the invention of automated looms that could fill factory floors and lop off children’s fingers? Some, as in early twentieth century America, would have been driving horses on farms but others would have been crafting from home. For women in particular, the loss of hand looms to the spinning jenny meant a shift away from home and family to cotton mills and hard-nosed bosses. For men too, the rise of the assembly line marked the end of the kind of sweating, hammering and hand-crafting of unreliable quality that could be done in a home workshop.

Interestingly, Peter Sweeney, founder and CTO of semantic technology firm Primal Fusion, has described Web 3.0 as the Internet's own industrial revolution, a time when the social connections of Web 2.0 give way to the automated production of content. Wolfram Alpha, he says, is one example of the way in which information can be produced automatically, without the kind of work-at-home handicraft that predated Dickens and which now characterizes the Web's co-working content producers.

That sounds unlikely. Easy communication is only going to increase the returns to home-working, and recession-hit tech types who have spent the last few months consulting from home will take some persuading to get back to the traffic jams when the economy does pick up. But today's home-workers are now primarily tapping keyboards rather than driving tractors. They're in the cities rather than in the dust fields of Oklahoma (although many of them, like those former agriculturalists, are also now in California). And unlike independent spinners and weavers, they find that they can compete easily with the productivity levels of factory and office-based employees.

Working from home, then, isn't a new way of working. It's a return to an old, traditional (and more enjoyable) way of working, and don't let the Luddites tell you otherwise.