Best Tips For Google Image Search - How to Index image in Google Image Search

How to get Images Indexed by Google Image Search
Google Image Search can be used in many ways.

1. If you want to know if a person is a man or a woman and the name doesn't help, do a search for the name.

2. If you don't know the meaning of a word, the pictures may help you.

3. A better search for Flickr. Google uses information from other sites that link to Flickr photos, so you may find Google search better.

4. Find what's interesting about a site by looking at the pictures it includes. For example: wired.com.

5. Find a new wallpaper for your desktop by restricting your search to large images. You can automate this using an application.

6. Find random personal pictures by searching for standard file names from digital cameras, such as DSC00001 or IMG_0001.

7. Type the name of a painter and you can take an art class.

8. Install a Greasemonkey script so you can view the original version of the image directly by clicking on the thumbnail.

9. Find the color of a word. "Word Color is a Windows program that uses Google Image Search to determine the color of a word or string of words. It goes out there, retrieves the top 9 images and loops through all pixels, calculating the average hue, which is later converted to a color."

10. If you want to grab search results, GoogleGrab is a tool that downloads images from Google Image Search. It even supports batch search.
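The average-hue calculation that Word Color is described as performing in tip 9 can be sketched in Python. Fetching the top images is out of scope here; this sketch assumes you already have their pixels as RGB triples, and note that naively averaging hue (a circular quantity) is only a rough approximation:

```python
import colorsys

def average_hue(pixels):
    """Average the hue of RGB pixels (0-255 channels), roughly as
    Word Color is described as doing. Returns a hue in [0, 1)."""
    if not pixels:
        raise ValueError("no pixels to average")
    total = 0.0
    for r, g, b in pixels:
        # colorsys works on floats in [0, 1]; hue is the first component.
        h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        total += h
    return total / len(pixels)

print(average_hue([(255, 0, 0)]))  # 0.0 (pure red)
green = average_hue([(0, 255, 0)] * 4)  # one third (pure green)
```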

Best Tips to Protect Your Personal Email

E-mail and instant messaging (IM) are terrific forms of communication for keeping in touch with family, friends, and business associates. However, using them unwisely may make you and your computer susceptible to spam, phishing scams, viruses, and other online threats. Here are some tips to avoid these problems:

1. Obtain comprehensive security software. Choose security software, such as McAfee Internet Security, that protects you and your PC from viruses, worms, Trojans, unwanted e-mail (spam), phishing scams, and other malicious software. It should also include a firewall that can monitor your Internet connection and stop unwanted traffic to and from your PC. Be sure to keep your security software up to date; ideally, pick a suite that applies updates and upgrades automatically.

2. Share your e-mail address with only trusted sources. Only your family, friends, and trusted business contacts should have your personal e-mail address. Do not post your e-mail address on Web sites, forums, or in chat rooms. If you post your e-mail address, you are vulnerable to receiving spam or having your e-mail passed on to others. If you would like to subscribe to a newsletter or Web site and receive confirmation e-mail for online transactions, consider using a generic e-mail address that is not linked to any of your personal information. An example of a generic e-mail address is giraffe@samplee-mail.com.

3. Be careful when opening attachments and downloading files from friends and family or accepting unknown e-mails. You can obtain a virus, worm, or Trojan simply by opening e-mail and attachments, and by accepting files from your friends, family, or others. If you choose to download files, make sure your security software is enabled and pay close attention to any warnings provided.

4. Be smart when using IM programs. If you use an IM program to communicate with friends and family, be careful when sending any personal information. Protect yourself by using a nickname for your IM screen name. Never accept strangers into your IM groups. Be smart about how you use your personal IM at work because your employer may monitor and view your personal messages.

5. Watch out for phishing scams. Phishing scams use fraudulent e-mails and fake Web sites, masquerading as legitimate businesses, to lure unsuspecting users into revealing private account or login information. To be safe, if you receive an e-mail from a business that includes a link to a Web site, make certain that the Web site you visit is legitimate. Instead of clicking through to the Web site from within the e-mail, open a separate Web browser and visit the business’ Web site directly to perform the necessary actions. You can also verify that an e-mail is in fact from a legitimate business by calling the business or agency directly.

6. Use e-mail wisely. E-mail is a great way to keep in touch with friends and family, and as a tool to conduct business. Even if you have good security software on your PC, your friends and family might not have the same protection. Be careful about what information you submit via e-mail. Never send your credit-card information, Social Security number, or other private information via e-mail.


7. Do not reply to spam e-mail. If you don't recognize the sender, don't respond. Even replying to spam mail to unsubscribe could set you up for more spam.

8. Create a complex e-mail address. A complex address makes it more difficult for hackers to auto-generate it, send you spam, or target it for other types of attacks. Make sure you come up with an e-mail address that you can easily remember. Try to use letters, numbers, and other characters in a unique combination. Substitute numbers for letters when you can. A sample complex e-mail address is: Tracy3Socc3r2@samplee-mail.com.

9. Create smart and strong passwords. Make it difficult for hackers to crack your password. You can create a smart password by incorporating capital letters, numbers, special characters and using more than six characters. An example of a strong password is: Go1dM!n3.


10. Never enter your personal information into a pop-up screen. Sometimes a phisher will direct you to a real organization’s Web site, but then an unauthorized pop-up screen created by the scammer will appear, with blanks in which to provide your personal information. If you fill it in, your information will go to the phisher. Install pop-up blocking software to help prevent this type of phishing attack.
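A strong password of the kind described in tip 9 can be generated rather than invented. A small Python sketch using the standard secrets module (the particular character classes and special-character set are assumptions for illustration):

```python
import secrets
import string

def strong_password(length=10):
    """Generate a password mixing the character classes the tip recommends:
    upper- and lowercase letters, digits, and special characters."""
    classes = [string.ascii_uppercase, string.ascii_lowercase,
               string.digits, "!@#$%^&*"]
    # Guarantee at least one character from each class, then fill the rest
    # from the combined alphabet and shuffle so positions are unpredictable.
    chars = [secrets.choice(c) for c in classes]
    alphabet = "".join(classes)
    chars += [secrets.choice(alphabet) for _ in range(length - len(chars))]
    secrets.SystemRandom().shuffle(chars)
    return "".join(chars)

pw = strong_password(12)
print(len(pw))  # 12
```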

WordPress Ping List : Best List of WordPress Ping URLs

WordPress Ping List : Best List of WP Ping URLs
Welcome to the WordPress Ping List. We've searched the Internet for the best and most up-to-date WordPress ping services for use on your blogs. Finding them can be tough, and finding an updated list can be even more work. We've made it easy with our list: just copy and paste it into your WordPress site and you're good to go.

This is a recommended ping list for a WordPress blog. The information is gathered from personal experience and various sources on the web.

Every time you post, these services will be notified of your new blog post, increasing your online exposure.

To use this list, copy and paste it into the Settings->Writing tab in the admin panel.

http://api.moreover.com/RPC2
http://bblog.com/ping.php
http://blogsearch.google.com/ping/RPC2
http://ping.weblogalot.com/rpc.php
http://ping.feedburner.com
http://ping.syndic8.com/xmlrpc.php
http://ping.bloggers.jp/rpc/
http://rpc.pingomatic.com/
http://rpc.weblogs.com/RPC2
http://rpc.technorati.com/rpc/ping
http://topicexchange.com/RPC2
http://www.blogpeople.net/servlet/weblogUpdates
http://xping.pubsub.com/ping
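Under the hood, these services all speak the same XML-RPC protocol: WordPress calls the standard weblogUpdates.ping method with your blog's name and URL. A minimal Python sketch of the request body (payload construction only; each service URL above accepts it via HTTP POST):

```python
import xmlrpc.client

def build_ping_payload(blog_name, blog_url):
    """Build the XML-RPC request body WordPress sends to each ping
    service: a call to the standard weblogUpdates.ping method."""
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

payload = build_ping_payload("My Blog", "http://example.com/")
print("weblogUpdates.ping" in payload)  # True

# To actually notify a service, e.g. Ping-O-Matic:
#   xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/") \
#       .weblogUpdates.ping("My Blog", "http://example.com/")
```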

SEO Smart Links : Essential Wordpress Plugin

SEO Smart Links Plugin for WordPress
SEO Smart Links can automatically link keywords and phrases in your posts and comments with corresponding posts, pages, categories and tags on your blog.

Further, SEO Smart Links allows you to set up your own keywords and a set of matching URLs.

Finally, SEO Smart Links allows you to set the nofollow attribute and open links in a new window.

Everything happens completely transparently, and you can edit the options from the administration settings panel.

How It Works

SEO Smart Links looks for keyword phrases that match the titles of your posts and pages by default (and you can enable category and tag matching too). These phrases are then turned into links. The matching is case insensitive and the original case is preserved.

So if I mention Amazing Grace, which is my theme and also the title of one of my pages, it will be automatically converted into a link.
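The matching behaviour described above can be sketched in a few lines of Python. This is an illustration of the technique, not the plugin's actual (PHP) code; the function name is hypothetical:

```python
import re

def auto_link(text, links):
    """Replace the first occurrence of each keyword with a link to its URL.
    Matching is case-insensitive and, as in SEO Smart Links, the original
    case of the matched text is preserved via m.group(0)."""
    for keyword, url in links.items():
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        text = pattern.sub(
            lambda m: '<a href="%s">%s</a>' % (url, m.group(0)),
            text, count=1)
    return text

html = auto_link("I mention amazing grace here.",
                 {"Amazing Grace": "/amazing-grace/"})
print(html)  # I mention <a href="/amazing-grace/">amazing grace</a> here.
```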

Features:

  • Find keywords in your posts, pages and comments and link them to your other posts, pages, categories and tags
  • Full control with customizable options
  • Ignore list for keywords you do not want to link
  • Improves your site's interlinking
  • Control external links with custom keywords
  • Add nofollow attribute or open links in a new window
  • Caching for speed - make sure you have define('ENABLE_CACHE', true); set in your wp-config.php

Download

Download Latest Version


    Installation & Usage
    1. Upload the whole plugin folder to your /wp-content/plugins/ folder.
    2. Go to the Plugins page and activate the plugin.
    3. Use the Options page to change your options
    4. That is all.



      How do I correctly use this plugin?

      Just install and activate it on your blog. By default, SEO Smart Links will find matching links to your posts and pages (if a keyword in your text matches their title).

      Default options are enough to get you going. If you want to tweak it, you can always edit the options. Be sure to check "ignore" options where you can state what keywords and phrases not to link.

      How do I enable SEO Smart Links cache?

      Make sure you have enabled WordPress cache by adding this line to your wp-config.php

      define('ENABLE_CACHE', true);

      Please be careful when editing this file and always make a backup!

      Changelog

      v2.1
      - Performance optimization and new option to link to categories and tags used at least n times (thanks Domink Deobald!)

      v2.0
      - Added the option for case sensitive matching

      v1.9
      - Various improvements and bug fixes

      v1.8.0
      - Added support for non-English characters (Unicode)
      - Added support for keyword synonyms (in the keywords section of the settings screen)

      v1.7
      - Performance optimization of the plugin. SEO Smart Links now causes much less strain on the server.

      v1.6
      - Added option to process only single posts and pages

      v1.5
      - Added nofollow and new window options

      v1.4
      - Added option for custom keywords. Now you can specify unlimited numbers of your own keywords and URLs to link to.

      v1.3
      - Enabled caching for speeding queries up

      v1.2
      - Added limits options
      - Fixed small bugs

How to Import/Export a Blog to Blogger

How to Import/Export a Blog to Blogspot

New Feature: Import and Export in Blogspot
Today’s release brings another long-desired feature to Blogger: Import and Export of blogs. Now you can export all of your posts and comments into a single, Atom-formatted XML file for easy backup. You can then import the posts back into Blogger, either into an existing blog or into a new one.

To export your blog, log in to http://draft.blogger.com/ and go to the Settings > Basic page. You’ll see the Blog Tools links at the top of the page for importing and exporting. (We also moved blog deletion up here from the bottom of the page. Don’t worry about accidentally clicking it, though; your blog wouldn’t be deleted until you confirmed on the next page.)

Once you click “Export blog” and press the “Export” button on the next page, your browser will prompt you to save the XML file for your blog. Keep it somewhere safe as a backup, or import it into a different blog. You can import one blog into another from the Blog Tools links, or when creating a new blog. Look for the “Advanced Options” at the bottom of the page.

When you import a blog, all of the posts will get saved in an “imported” state. From there you can publish just a few, or all of them at once. Here are some ideas for what you can do with importing and exporting:

• Merge two or more blogs into one. Take the exported posts and comments from one blog and import them into another one.
• Move individual posts from blog to blog. After importing, select just the posts you want and publish them with one click.
• Back up your blog to your own storage. You can keep your words safe and under your control in case anything happens to your blog, or to us, or if you want to remove them from the Internet.
• Move your blog somewhere else. Our export format is standard Atom XML. We hope to see other blogging providers extend their Atom support to include import and export. And, if you decide to come back to Blogger, importing your export file will get you back up and running in seconds.
Caveats

• The export format currently only covers blog posts and comments to those posts, not blog settings or templates. To back up a Classic template, copy and paste the template code from the editor. To back up a Layouts template, use the Backup / Restore template option to download a copy of your template.
• Before importing a blog for the first time, we recommend that you create a new, throwaway blog to import into so you get a sense for how the process works. Once you're comfortable, import into your public blog.
• At the moment there is a 1MB size limit on the blog you can import. This is a bug that we are correcting.

Have you imported or exported your blog? Let us know about how it went in the comments.

Get Targeted Backlinks to your blog with Comment Kahuna

Every blogger, unless the blog is set up exclusively for private and limited purposes, is interested in gaining more quality traffic and a better blog ranking in search engines. One of the widely recognized approaches to improving your position and attracting more visitors is receiving quality backlinks. However, while the task sounds simple and self-explanatory, realizing it is quite complicated. How can you encourage another blogger, preferably one with a good Google PageRank standing, to post a link to your blog?



Comment Kahuna is one of the best free tools on the market that will assist you in reaching your goal. It is a simple but efficient utility that helps you find blogs related to your website's niche. Use it to offer informative comments on blog posts. This will give your website the quality backlinks that search engines look for while evaluating your blog's position. Many of these blogs get lots of visitors, so if your comment is interesting, relevant and informative, some will follow the link to your website.


I checked multiple reviews on the web for user comments, and most of them are super positive, so try it for yourself. It is absolutely free, but registration on the site is required.


Website: http://commentkahuna.com

Disadvantages of Reciprocal Links Exchange

What is Reciprocal Links Exchange

Reciprocal linking is an agreement between two web site owners to each provide a hyperlink within their own web site to the other's web site. Generally this is done to provide readers with quick access to related sites, or to show a partnership between two sites. Reciprocal links can also help to increase traffic to your web site in two ways. Firstly, you will probably have some viewers visit your site by clicking the reciprocal link directly. Secondly, most Internet search engines take into account the number of web sites which contain links to your web site; the more hyperlinks to your site that are found, the higher up in the search engine rankings (depending on the search term) you'll find your site.
Reciprocal linking between web sites became an important part of the search engine optimisation process thanks to the link popularity algorithm PageRank employed by Google, which ranks web sites for relevancy dependent on the number of links that led to a particular page and the anchor text of the link.
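As a rough illustration of how a link-popularity algorithm weighs incoming links, here is a toy power-iteration PageRank sketch in Python. It is a simplified model for intuition only, not Google's actual algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank over a dict mapping page -> list of pages it links to.
    Each iteration redistributes rank along outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# B receives links from both A and C, so it ends up with the highest rank.
r = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```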

For years people have said you need to submit to the search engines. With the advances in search engine spider technology, it is no longer a requirement to submit to the search engines. Rather it is a requirement to submit to topical directories. Reciprocal links and topical directories help you move forward and open the doors for more people and search engine spiders to find your site.

Three-way linking
Three-way linking (site A → site B → site C → site A) is also a kind of reciprocal linking, but a special one. This type of linking is becoming increasingly popular since some search engines give less consideration to the value of normal reciprocal links (site A → site B → site A). Some search engines give a higher value to web sites that link to another web site without that web site displaying a return link; three-way linking meets this criterion while also offering a reciprocal return link from, say, a sister site.
The aim of this link building method is to create links that look more "natural" in the eyes of search engines. The value of links built by three-way linking can then be better than that of normal reciprocal links, which are usually done between two domains.

Automated Linking
In order to take advantage of the need for inbound links to rank well in the search engines, a number of automatic link exchange services have been launched. Members of these schemes will typically agree to have several links added to all their web pages in return for getting similar links back from other sites.

Link Exchange
An alternative to the automated linking above is a Link Exchange forum, in which members will advertise the sites that they want to get links to, and will in turn offer reciprocal or three way links back to the sites that link to them. The links generated through such services are subject to editorial review.



Disadvantages of Reciprocal Link Building

I would like to list below some disadvantages of reciprocal links exchange.

1. The time and effort required is higher than for one-way link building practices.
2. If we exchange links with a bad neighbor, our site may get penalized.
3. A good neighbor may become bad in the future and put us in trouble.
4. A link exchange partner may remove our link without our knowledge while we still link back.
5. Search engines discourage link exchanges.

List of Best Tools to Check Your Website Usability

List of tools to test a website

Usability is a qualitative attribute that assesses how easy user interfaces are to use. More importantly, it reflects the behaviour of users who interact with your webpage for the first time.

In this post I'm sharing some very useful tools for webmasters to check website usability:

Feng-GUI

Feng-GUI simulates human vision during the first 5 seconds of exposure to visuals, and creates heatmaps based on an algorithm that predicts what a real human would be most likely to look at.

This offers designers, advertisers and creatives a pre-testing technology that predicts the performance of an image by analyzing levels of attention, brand effectiveness and placement, as well as breaking down the flow of attention.

Feng-GUI

Five Second Test

A simple online usability test that helps you identify the most prominent elements of your user interfaces. A central test management screen helps organise and visualise your responses so you can get all the information you need at a glance.

Five Second Test

Loop11

Loop11 is a web-based user-experience testing tool, allowing companies to conduct online, unmoderated user testing on any kind of digital interface. Loop11 is not a survey or web analytics tool, but a user experience tool… helping you to understand user behaviour.

Loop11

CrazyEgg

CrazyEgg is simple: it shows the hotspots where users click on a site. This information is not the same as popular pages; instead it is practical information about how and where people click on your site. More importantly, CrazyEgg's approach lets you understand the difference between where you want your users to click and where they are actually clicking.

Traditional site tracking tools offer you a ton of information, including:

popular pages, entry pages, exit pages, referrers, visitor paths, and visit length.

CrazyEgg

Clixpy

Clixpy is a web usability testing tool. It's very easy to install: just paste a few lines of JavaScript code into your site's HTML. When users browse your website, Clixpy traces everything they do and then plays it back for you, giving you the opportunity to extract any information you may need.

Clixpy

ClickDensity

More than just heat maps, ClickDensity is a full usability toolkit. With a unique integrated A/B test suite, you can trial and analyze improvements at the touch of a button. It takes minutes to set up. From the second you see the reports, they just make sense. It complements your existing web analytics for an unbeatable analysis of visitor trends.

ClickDensity

Know more about Usability

Share your thoughts and ideas about website usability, leave a comment

Importance of Sitemaps In SEO

Importance of Sitemap in Search Engine Optimization

Sitemaps provide a way for Web sites to specify what pages within the site should be indexed and what new content has been added. Basically, it provides a communication channel between the search engine and the site. 

 A site map (or sitemap) is a graphical representation of the architecture of a web site. It can be either a document in any form used as a planning tool for web design, or a web page that lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site. 
 A hierarchical diagram of the pages on a Web site, starting with the home page at the top. A site map helps visitors navigate large, complicated sites by showing its entire structure. It is also used as a master diagram of the Web site for Web designers. 

 A sitemap is an XML file that contains a list of site URLs and related attributes detailing what should be indexed within a specific site. It must be UTF-8 encoded, and the <urlset>, <url>, and <loc> XML elements are required in the sitemap file. The webmaster can generate a sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo, and Ask now use the same protocol, having a sitemap lets the biggest search engines have updated page information. 

Benefits of Sitemap: 

 · Site maps can improve search engine optimization of a site by making sure that all the pages can be found. 
 · This is especially important if a site uses Macromedia Flash or JavaScript menus that do not include HTML links. 
 · Most search engines will only follow a finite number of links from a page, so if a site is very large, a site map may be required so that search engines and visitors can access all content on the site.

   

 The Importance of Sitemaps From SEO's View: 

A Sitemap is the representation of a website's architecture and contains links to all the pages of the website. A Sitemap can be of two types, each having its own distinct purpose. 

 HTML Sitemap: 
 An HTML sitemap is created with the visitor in mind. It helps the visitor navigate the website and gives a clear view of the website's sections and categories. 
 
XML Sitemap: 
While the HTML Sitemap is designed with visitors in mind, an XML Sitemap is designed specifically for Search Engines. Search Engines send their robots, also called searchbots, to index a website, wherein they encounter the sitemap.xml file. 

Why Is A Sitemap.xml File Important

1. The Sitemap simply lists out all the pages of a website in front of the searchbots so that the searchbots know about all the pages in the website and index them. 

2. A Sitemap helps the searchbot in distributing Page rank across all the web pages with the help of the <priority> tag. This ensures that all the inner pages of a website also have a good page rank. 

3. The <changefreq> tag of the sitemap.xml file gives the searchbot an indication of how frequently the web page will change. This tells the searchbot to visit the website with the same frequency to index the changes made. 

One can either generate a sitemap.xml file manually, or use one of the free sitemap generator tools available online. Though most of the free sitemap generator tools have a limit of up to 500 pages, paid tools give you the freedom to create a sitemap of unlimited pages. 


A Typical Sitemap.xml File Looks Like: 

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <priority>0.9</priority>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.yoursite.com/about-us.html</loc>
    <priority>0.8</priority>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.yoursite.com/services.htm</loc>
    <priority>0.8</priority>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.yoursite.com/contact.htm</loc>
    <priority>0.7</priority>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
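A file in this format can also be generated programmatically. A small Python sketch using only the standard library (the build_sitemap helper is illustrative, not a standard tool):

```python
from xml.etree import ElementTree as ET

def build_sitemap(entries):
    """Build a sitemaps.org-style sitemap from (loc, priority, changefreq)
    tuples and return it as an XML string."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, priority, changefreq in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = priority
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://www.yoursite.com/", "0.9", "weekly"),
    ("http://www.yoursite.com/about-us.html", "0.8", "weekly"),
])
```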


Worst SEO Mistakes You Can Make

Biggest Search Engine Optimization (SEO) Mistakes Experts Make
Search Engine Optimization is a very hot topic on the World Wide Web. After all, everybody wants to rank higher, come up on the first page of Google search results, and get more traffic. I have identified and made a list of the top 15 SEO practices that I tend to forget quite often. These simple SEO techniques, if practiced properly, can make a significant difference in how my pages are ranked in search engine queries.

1. Use the rel="nofollow" tag on low-value links to not pass page rank juice. For example 'Read the rest of the entry', 'About', 'Contact' etc.
2. Use proper anchor text for interlinks. Don't use 'here' or 'there'.
3. Optimize the images: always create alt tags and write a description in the alt tag.
4. Use search engine friendly permalinks. Make sure the URLs do not have '&', '?', '!' etc. characters.
5. Use hyphens (-) between words to improve readability.
6. Do not use underscores (_); use hyphens (-) instead.
7. Do not use session IDs in URLs.
8. Use sticky posts.
9. Use tag clouds.
10. Have a category description paragraph.
11. Let visitors subscribe to category-specific RSS feeds. (Use a category-specific RSS plugin for WordPress.)
12. Use internal linking when possible and appropriate.
13. Use sub-directories rather than sub-domains when possible. Sub-domains do not share link love from the main domain, as they are treated as different domains.
14. Research the target audience and aim the site content appropriately.
15. Keep the content up to date. Visitors don't like outdated content. Updating the content frequently also attracts search engine spiders to index the web pages frequently.
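Tips 4-6 (clean, hyphen-separated permalinks) can be sketched as a small slug function in Python. WordPress builds permalinks for you, so this is just an illustration of the rule:

```python
import re

def slugify(title):
    """Turn a post title into a search-engine-friendly permalink slug:
    lowercase, words separated by hyphens, no '&', '?', '!' or underscores."""
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Worst SEO Mistakes You Can Make!"))
# worst-seo-mistakes-you-can-make
```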

Have you identified any SEO mistakes that you commonly make?

Seo Tips for Launching a New Website

Basic Tips For Launching a New Website While Keeping SEO In Mind
Search engines like Google and Yahoo have become intelligent in the past 4-5 years. It is very difficult to game search engines nowadays by manipulating tags and stuffing keywords. For a new site it may take 6 to 9 months to achieve competitive rankings on Google. From an SEO point of view, I recommend the following steps for launching a new website.

i) Registering a domain name with similar Keywords
Pick a domain name which either has keywords associated with your product/service or your brand name. It is better to go for short domain names which are easy to remember. If your target market is global, go for a .com domain. If you are targeting a particular country or language, it is better to register a domain name in the TLD of that country. For example, go for a .jp or .co.jp domain if you are targeting Japan. I recommend you register the domain for at least 2 years; the longer, the better, to gain the trust of search engines.

ii) Domain Hosting
Host your site with a reliable hosting company. Stay away from companies which allow hosting of pharmacy, gambling and adult sites, or have a history of serving spam. If you are targeting a particular country/language, it is advised to host your site in that country to boost your rankings in the local search engines for that country/language.

iii) Registering with Search Engines
Since search engines give importance to domain aging and can track parameters like the registration date and first-crawled date, you should register your site with search engines as early as possible. Even if your site is not fully ready, it is better to put up a temporary one- or two-page site and register with the search engines at the earliest. Getting indexed in MSN may take only 3-4 days, whereas it may take 2-3 months to get indexed in Google and Yahoo. To speed up this process, you should build incoming links from authority websites. The more links a site receives, the less it is ignored by the search engines. It also reduces the Google sandboxing time for new sites. Submitting your sitemap to Google Webmaster Central also ensures that your site will get crawled and indexed regularly.

iv) Build trust with Search Engines
To build trust with the search engines, you should build your incoming links and content at a regular pace. Keep your content unique and relevant to your products/services. Building links from relevant websites will boost your rankings.

v) Have Web Analytics In Place; Monitor Results
The only way to know whether you are satisfying your goal (and delivering results to your business to make up for the investment!) is to track and monitor your increase in visitors and conversions. As a general rule, if your end result is too design-focused, you'll see fewer leads; if it's too search-focused, you'll see fewer conversions. Keep balance in mind and you will have results; your boss will thank you for it!

Building Backlinks through Forum Posting

Backlink Building through Forum Posting / How to Build Backlinks
There is no doubt that quality backlinks from relevant sites are important for SEO success. If a page has good content, other websites will start linking to it in a natural manner; this is the main hypothesis for including backlinks as part of the search engine ranking algorithms. Thus, a page with better content will naturally have more backlinks, and will rank better. However, in practice, one will have to go for other ways of gaining relevant quality backlinks as part of an SEO strategy.

Getting listed in web directories, article directories, posting in forums, and blogs are some of the acceptable methods of acquiring quality backlinks. Generally, search engines index forums frequently, so posting in forums is a good way to gain quality backlinks with the anchor text of your choice. While submitting articles to directories can be a time consuming process, it does not require much effort and time to post in forums; however, one has to do his homework and take some precautions in order to be successful using forums for gaining backlinks.

You must do some research before joining any forum. You should join forums that are relevant to the sites you want to get backlinks for, and only choose forums that are popular and active. Backlinks from high-authority forums are very valuable. The number of active members and the Google PageRank of the forum can give you a good idea of its popularity. Keep your signature short and link it to your main website. Never create posts that sound like propaganda or that are irrelevant to the topic. Pay attention to the TOS of the forum, or you will risk getting banned and losing all the backlinks you have built.

If you post something which is important or useful to other readers, they will visit the site in your signature. Thus, posting in forums can bring significant traffic to your website, apart from building relevant backlinks. Forums are also good for building your brand or image: some of the best bloggers, Internet marketers and SEOs spend a significant amount of time posting on forums to increase their publicity. That is another benefit you may get from forum posting, apart from using it as a link development strategy.

Changing Title Meta Tag Hurts Google Rankings

Does change of Meta Description affect Google Ranking?
It is no secret that the title meta tag is one of the most important factors in on-page search engine optimization. Search engines including Google, Yahoo and MSN give considerable weight to the title and description meta tags in their algorithms. What many people do not know, however, is that changing that same title tag abruptly, in the hope of improving your search engine rankings, can actually hurt them.

If you are currently ranking well with your page title, you should not risk changing it to improve your rankings. You may well see your current Google rankings go down after such a change. I have experienced negative results myself when I changed the title tag for a site that was already ranking well; in fact, Google slammed the site with a -950 penalty.

Your page title is the main entry gate for search engines. Changing the title meta tag can easily make your Google rankings fluctuate. If you change your title, Google and other search engines will take some time to reflect rankings for the new title. In particular, changing the main title and description tags after a long period of stability can look suspicious to Google and other search engines.

In short, changing the main title of a site that is already ranking well is not a good idea. Instead of changing the page title to try to improve on certain keywords, it is better to add content about those keyword phrases to your page, and to add links with appropriate anchor text from sites with relevant content. Google has always penalized over-optimization, and now it seems to frown on even normal search engine optimization. So the best policy is to optimize the website at the time of creation: do it once, and do it right!

Google Page Rank Updates : Last Google PR Update

It looks like Google is rolling out a PageRank update. I woke up this morning and some of my sites were showing a new PR.
Are you seeing the changes too? Let me know how your PR fluctuated.

It also looks like Google is settling back into a regular schedule for PageRank updates. The last one happened on 29/30 October 2009, so that is roughly one PR update every three to four months. It would be good if they kept that pace constant.

When’s the Next Google PR update?
Let’s see whether we get another update in mid-January or February 2010.


Dates of Previous PageRank (PR) Updates:
Last Confirmed – 30 October 2009
Confirmed – 27/28 May 2009
Confirmed – 1/2 April 2009
Confirmed – 30/31 December 2008
Confirmed – 27 September 2008
Confirmed – 26 July 2008
Confirmed – 29 April 2008
Confirmed – 9-12 January 2008
Confirmed – 26 October 2007
Confirmed – 28 April 2007

How to Make Mozilla Firefox Load Faster

How to make a Website load faster in Mozilla Firefox

Firefox may run quickly, but it loads slowly; here's how to fix it.
(Note: this tip is for experienced computer users only.)


You can slash Firefox's slow load time by compressing its DLLs and executables. There are many compression tools to choose from, but I suggest UPX, which is free, efficient, and time-proven.

1. Download UPX from http://upx.sourceforge.net/#download

2. Unzip upx.exe into your Firefox installation folder, which is normally C:\Program Files\Mozilla Firefox.

3. Make sure Firefox is not running, then open a command prompt in the Firefox installation directory.

4. Type the following command on a single line and hit Enter:

for %v in (*.exe *.dll components\*.dll plugins\*.dll) do upx "C:\Program Files\Mozilla Firefox\%v"

5. If on some later occasion you want to unpack the files, run the same command but add the decompression switch "-d" after "do upx":

for %v in (*.exe *.dll components\*.dll plugins\*.dll) do upx -d "C:\Program Files\Mozilla Firefox\%v"

That's it; enjoy the difference!

How to Check if Your Website is Penalized by Google

How to Check if a Website is Penalized by Google
A question all webmasters ask at some point is: how do you check whether your website has been penalized by Google?

Here are a few checks to help you verify whether your website is a victim of a Google penalty.

Keyword Rankings:

If you were at the top of the results for some of your keywords and then see a major, sudden drop across all of them, your website has probably been penalized by Google, because your Google SERP positions have fallen sharply.

Number of Pages Indexed:

If large numbers of pages from your website used to appear in Google when checked with the site: operator, and the same check now shows very few pages, most of your pages are no longer in the Google index. This kind of major de-indexing also indicates that your website is in trouble.

Page Weight Loss:

Suppose your pages still appear in Google when you use the site: operator. The next thing to check is the weight of these pages: even though they are in the Google index, they do not appear in search results, or appear very low, even when you search for the complete post title plus your blog or website name.

For example: "Secret of Google" + "GoogleTips", where "Secret of Google" is the keyword you are searching for and "GoogleTips" is the name of your blog or website.

Google Cache:

Google Cache is another way to find out whether your website has been penalized by Google. Google normally keeps a cached copy of every website on the internet, and how often that cache updates depends on many factors related to the website. For popular websites like CNN, the Washington Post, or TechCrunch, the last cache date is usually the current date or just one day back.

If you check your website with the cache: operator in Google and find that the last cached date is very old, it means Google is no longer showing interest in your website. Most likely it is a penalty.

Complete De-index:

This is the simplest check for a Google penalty: your site is completely de-indexed and no longer appears in Google with either the info: operator or the site: operator.

Number of Backlinks:

If, a few days back, Google was showing thousands of backlinks for your website, and the link: operator now shows a huge drop in that number, a Google penalty is a possibility.

Index Time:

Indexing time is another factor you can use to check for a penalty. If new posts from your website used to get indexed within hours and now take a week, that is a clear sign of a Google penalty, or at least that Google is having trouble indexing your website.

Google PageRank:

Google PageRank is a value attached to your website. You may see a drop in your PageRank if Google has penalized you for any reason. If a few days ago your PageRank was, say, 4 and now it has become 2, it is very likely that Google has penalized your website.

Wait for Google Mail:

Google often sends a message when it finds something harmful on your website. It will de-index the pages that have problems and notify you so you can fix them. Once you fix the problem, you can usually get back into the Google search index.

Important terms used in this article:

link: operator - used with your website, shows the number of backlinks to your website.

info: operator - used with your website, shows the information Google holds about your website.

site: operator - used with your website, shows the number of pages from your website indexed in Google search.
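A quick way to try these operators is to build the corresponding Google search URLs yourself. The sketch below is a minimal Python helper that URL-encodes each operator query (with `example.com` as a stand-in domain); opening a resulting URL in a browser runs that check.

```python
from urllib.parse import quote_plus

def google_query_url(query):
    """Return a Google web-search URL for the given query string."""
    return "https://www.google.com/search?q=" + quote_plus(query)

domain = "example.com"  # stand-in; substitute your own site
checks = {
    "indexed pages": f"site:{domain}",
    "site info": f"info:{domain}",
    "backlinks": f"link:{domain}",
    "last cached copy": f"cache:{domain}",
}

for label, query in checks.items():
    print(f"{label:>16}: {google_query_url(query)}")
```

The `quote_plus` call simply escapes the `:` in each operator so the query survives the trip through the URL.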

What Is Google Caffeine? New Version of Google Search Is Launching Soon

Google Caffeine is an "under the hood" development of the Google search engine algorithm that will adjust (slightly) how sites rank on Google's search engine results pages (SERPs). The arrival of Google Caffeine means that your current website search engine optimization (SEO) may become less effective, and you may lose or gain position in search results for certain keywords. Google Caffeine is not yet integrated into the standard algorithm, but you can see where your website and other websites stand in "Caffeine-induced" results by visiting the Google Caffeine test site, http://www2.sandbox.google.com

Google "Caffeine", the new search engine, improves the index size and the speed of queries and, most importantly, changes how search engine rankings are valued.

In a post on Webmaster Central Blog, Google notified the world that the next Google search engine was ready for testing.

My first impressions of Google "Caffeine" have been pretty good. Search results in both the new and old Google come back lightning quick; I have to take Google's word that the new engine is a few milliseconds quicker, but it won on almost every search I did (with one tie).

I also have to take their word on the number of results. I am sometimes seeing as many as ten times the search results in the new Google; I'll assume Google knows how many results it has. Interestingly, when I searched for things like "online pharmacy", the new Google returned fewer results than the old one. That suggests the new Google is smarter at finding fake websites and de-indexing them.

The results are what make Google so popular and will be the true test of how good this new engine is. In my tests, the new Google pulls significantly different results than the old Google. For what I was searching for (my name, people I knew, events, computer hardware) the results were significantly better. In fact, it looks like the search keywords have become a much bigger factor than before, and I'm seeing smaller sites rise to the top more often.

Search engine optimization (SEO) specialists are going to have to work out a whole new formula for getting their clients to the top... and who knows exactly how this new Google search algorithm works? SEO people will have to start nailing down the new rules of PageRank to keep their customers on top.

Finally, how does it compare to Bing?

Not too bad in my tests. It is certainly faster as well. I've been a Google person for the last six years, and I am not seeing anything bad in the new Google, or good enough in the new Bing, to change that.

Speaking of making a difference, it would be nice if some developer would add this new search to the Safari/Firefox browser search bars. That would really boost my testing capabilities. Any takers?

Is Google Changing the PageRank Algorithm?

Has Google Changed it's PageRank Algorithm?
The latest news around the blogosphere is that the Google PageRanks of some large sites have been hurt. Here is a list I gathered of big blogs that supposedly lost PR:

■Engadget (from 7 to 5)
■AutoBlog (from 6 to 4)
■Problogger (from 6 to 4)
■Copyblogger (from 6 to 4)
■Search Engine Journal (from 7 to 4)
■Quick Online Tips (from 6 to 3)
■Search Engine Roundtable (from 7 to 4)
■Blog Herald (from 6 to 4)
■Weblog Tools Collection (from 6 to 4)
■JohnTP (from 6 to 4)
■Coolest Gadgets (from 5 to 3)
Update: It looks like mainstream websites that were selling links were also penalized:

■Washington Post (from 7 to 5)
■Washington Times (from 6 to 4)
■Charlotte Observer (from 6 to 4)
■Forbes.com (from 7 to 5)
■SFGate.com (from 7 to 5)
■Sun Times (from 7 to 5)
■New Scientist (from 7 to 5)
■Seattle Times (from 6 to 4)

Andy Beard thought the drop was because of text link selling, which was reported about a week or so ago. This turns out not to be the case.

TechCrunch reported that Google didn't drop the PageRanks because of the selling of text links, but because of link farms. Link farms are networks in which each site provides hundreds of outgoing links on each page to other blogs in the network, in some cases creating tens or even hundreds of thousands of cross links.

This all comes a week after TechCrunch's linking characteristics were analysed, when it was reported that a third of all TechCrunch outgoing links pointed to related TechCrunch sites. Link farming would explain why the TechCrunch PageRank hasn't changed while the CrunchBase ranking is now at 0.

These changes will affect a lot of blog networks that survive on text link ads and related sales that depend on strong Google PageRanks. A drop from PR7 to PR4 shouldn't affect traffic too heavily, but it will make the tough job of selling ads much tougher. In the coming months and years I think we will see a lot of small blog networks starting to struggle and trying to find another way to survive.

About a month ago I saw one of my sites decrease from PR3 to PR2, and I found it difficult to work out why. Then, when Google announced that selling text link ads would bring a punishment, I finally found out.

So, why the decrease? While I continue to look into it, there are several reasons why Google may be changing its PageRank algorithm.

Paid Linking: The easy excuse is that they're targeting paid links, but not all sites that experienced the drop sell or buy links.

Mass Linking: Do we link out to too many sites via blogrolls? Does linkbait simply result in too many links, even if they are natural? Do blog networks use influential linking to their advantage? I think PageRank has been spread too thin, and Google is changing its PageRank formula to address the mass publishing of the past two years.


Devalue PageRank:
PageRank is seen by many as the be-all and end-all value of a website. Our PageRank dropped, but we are receiving more Google search traffic than ever. PageRank does not determine site rankings in Google, nor traffic, and it should not be mistaken as doing so.

Link Building to Increase Website Traffic

How to Increase Website Traffic
Do you offer free promotions and monthly contests for new members? No matter how many bells and whistles your site has, all of it is virtually worthless if you cannot drive traffic to your website. Building links to your website is one of the smartest things a webmaster can do to establish a solid web presence. You will get direct traffic from people clicking the links, and indirect traffic from partner sites with higher rankings in the major search engines.

As you might already know, Google uses a system called link popularity and link reputation to determine a site's relevance and position in the search rankings. The mantra stands: the more links you have leading to you on other web pages, the higher your site’s ranking. Link reputation, on the other hand, means how important the incoming links are to your webpage. If you have your website link at a Professional Dog Trade Show site with 200,000 monthly visitors, you will have a higher link reputation than if you had it posted on Sally's Personal Dog Page with 10 visitors a month. In essence, the more traffic your "affiliate" sites have, the more illustrious your link reputation and the higher rankings you will achieve.

The premise here is not to post your links on just any website. Since search engines surface the most relevant results, the websites carrying your link should be on the same topic as yours. Suppose Bill Smith searches for "Ultimate Championship Fighting". The most relevant sites would be those with the most links from sites about Ultimate Fighting; after all, it is not often you see a woodworking site linking to a Porsche appreciation page. Optimizing your website with the keywords Ultimate Championship Fighting and affiliating with related fighting pages will bring you a lot more traffic, leading to higher link popularity. So go ahead: become affiliated with related websites (preferably the highest-ranking ones you can) and get your link posted on their pages for more traffic.

Google Caffeine: Google's New Search Engine Index

Is Google Caffeine Faster?
Microsoft has recently unveiled their new search engine, Bing. And with the recent announcement that Microsoft's Bing is going to soon power the Yahoo organic search results, Google needed to do something to keep their market share of search.

Google has unveiled a new test version of their search engine, which is being called "Caffeine". This is being touted as the "next generation of search".

Google's Matt Cutts said, on the Google Webmaster Central blog that they're very interested in feedback:

"Right now, we only want feedback on the differences between Google's current search results and our new system. We're also interested in higher-level feedback ("These types of sites seem to rank better or worse in the new system") in addition to "This specific site should or shouldn't rank for this query." Engineers will be reading the feedback, but we won't have the cycles to send replies."
By letting the public test the new version of Google search (which is noticeably without the Google AdWords ads), Google is able to use the public as their reviewers: and typically Google's best critics will reveal issues that need to be addressed. If you are testing out the new version of Google, and you find a search result that is not to your liking, there is a "Dissatisfied? Help us improve" link at the bottom of the search results page.

What is different between the old version of Google (what we currently see at www.google.com) and this new "Google Caffeine" version of Google? Some are saying that this new version is much faster than the older version of Google. Mashable's conclusion is that "This search is not only faster, but in some instances in our few tests, seems more capable of producing real-time results."

One of the claims about Google Caffeine is that it does a better job of including recent search results. Let's take a recent search phrase, one that Google most likely would not have indexed a few days ago, and compare the results. One of the "trending topics" on Twitter as I write this is "RIP Eunice Kennedy", so let's use this phrase as a test.

On Google Caffeine there appears to be a search result that was indexed 25 minutes ago. Google does not typically show a "cached" version of recent search results.

Google Caffeine

And on the "normal" Google search results, the search results appear to be almost exactly the same:

In fact, what is interesting is that both versions of Google are currently indexing Twitter statuses from one hour ago or even sooner. I honestly expected that the new Google Caffeine version would be indexing Twitter statuses much faster than that. So, let's see if this is the case. I searched for this phrase on both Google Caffeine and on the "normal" Google: [site:twitter.com "RIP Eunice Kennedy"].

Google Caffeine is not indexing as many web pages as the "normal" Google search, while the timeliness of the search results appears to be about the same. There are "tweets" from Twitter.com that show up in the search results (on both Caffeine and Google.com) from as recently as 10 minutes ago. A quick test on another Twitter trending topic reveals the same on both; a search for this phrase shows Google is about 21 minutes behind: [site:twitter.com "Social Media Pillows"].

What about comprehensiveness? I have tested many searches on both Google Caffeine and Google.com and am not noticing any better indexing (or indexing of more pages) for several websites I tested. Using the "site:" command on both, I found that on some searches Google Caffeine is indexing more pages.

But on other "site:" searches, there are more pages indexed on Google.com. So, I'm not convinced (yet) that this new Google Caffeine is more comprehensive. In fact, the old Google.com has more pages indexed on Twitter (site:twitter.com) than Google Caffeine.

What about relevancy? I tested several search results, including those that included city names and specific "local searches" and I'm not seeing much of a difference at this time. So, the jury is still out: Google Caffeine appears to be faster than "old Google", but the other changes that have been made to Google Caffeine are not really that noticeable.

Lesser Known Method of Blog Promotion

A few days ago, while browsing BlogCatalog for news and to promote my blog, I found a useful thread about a lesser-known method of blog promotion. If you're interested, read on!

Written by BlogBadly, taken from one of his discussions on BlogCatalog (with necessary edits):

Whenever someone makes a "SEVEN EASY TIPS FOR BLOG PROMOTION/PR INCREASE" post anywhere, I rarely see adding your posts to blog carnivals.
Blog carnivals are usually held by one blog. The owner reads submissions and decides which ones to put in a post about the carnival; about 5-10 entries are picked for each post. The owners of the carnivals generally don't expect a post back to them - maybe just a trackback - so it's basically a free link to your blog. It can attract readers too. It's not spammy, either, as all of the posts are related in the same category (business, satire, family, pets, etc.).
It's really easy to do: just go to blogcarnival.com, sign up, look for some carnivals that match your blog category, and submit. You can also run your own carnival if you want.
Blog carnivals are actually quite good promotion methods - they won't harm you, at least. Unless you get dizzy on the merry-go-round. Ha. I made a bad joke.
And on a quick note, Newsvine.com (a news site where people tag news/opinion/other articles and post them on their own page) gives a free backlink if you submit your own site (I don't see any penalties for it). It also appeared in my Technorati blog feedback with an authority of 50-100. It's another option if you have a news blog or just a few news posts.

My opinion: some bloggers already know about promotion through blog carnivals, and they like it because it does drive traffic to their blogs (if you have great posts, of course). So I suggest you create really good content first before submitting it to a carnival.

Nofollow Links - Advantages & Disadvantages

Nofollow Links - The Good side and the Bad side
Blog commenting is one of the best ways of getting backlinks to our sites and blogs, and if you frequently gather backlinks that way, you must be familiar with the dofollow and nofollow link attributes. You may wonder why Matt Cutts and Jason Shellen designed the nofollow attribute, since all we know is that nofollow links don't pass any reputation (such as PR) to our pages. But recently I found out why the nofollow attribute is so important, both for site owners and for those who want to gather backlinks.

:: Good side / Advantages

1. It prevents comment spam.
Remember that most people who want to gather link juice leave poor-quality comments on blogs that use the dofollow attribute; they think it is a waste of time to comment on a nofollow blog. So a nofollow blog has a smaller chance of receiving comment spam.
2. You don't want to pass reputation on to a website.
For example, you write a post about how people spam comments, and you link to a website offering an automatic commenting service. You don't want to give any reputation to that kind of website, so you can add the nofollow attribute to that link.
3. To be trusted by the major search engines.
If you are smart about putting the nofollow attribute on particular links, the major search engines are more likely to trust you: by adding nofollow to links pointing at sites the search engines don't trust, you signal that your blog is a trustworthy site. Using nofollow in your comment area also discourages comment spam, which is good for your content, since search engines don't like blogs full of spam comments.


:: Bad Side / Disadvantages

1. Less chance of getting comments from other bloggers.
Since most people will only comment on dofollow blogs to get some reputation credit for their pages, your nofollow blog may be passed over, which will slightly reduce your total number of comments.
2. It is harder to get link juice to your own websites and blogs.
Say you want to gather backlinks through blog commenting. If you really want reputation credit for your websites, you have to leave comments on dofollow blogs, so your task becomes harder: besides writing high-quality comments that the blog owner will approve, you have to find genuinely dofollow blogs on the same topic as yours.
:: Some suggestions

1. Be smart about placing nofollow attributes on particular links and areas.
Don't apply the nofollow attribute to your whole website. Focus on the comment section, or on particular links pointing to websites you don't want to give any reputation credit. You can also switch to dofollow once the comment section of your blog gets busier and real discussions develop between you and your commenters.
2. When leaving comments, don't worry about nofollow or dofollow.
Why do I suggest not worrying about nofollow or dofollow when commenting on other blogs? Because your backlinks will look more natural, containing both dofollow and nofollow links.
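In markup terms, nofollow is just a rel attribute on an anchor tag. Here is a minimal sketch of how a blog might render comment links with or without it; the URLs and the helper name are hypothetical, used only for illustration.

```python
from html import escape

def comment_link(url, text, nofollow=True):
    """Render an anchor tag, adding rel="nofollow" for untrusted links."""
    rel = ' rel="nofollow"' if nofollow else ""
    return f'<a href="{escape(url, quote=True)}"{rel}>{escape(text)}</a>'

# Untrusted commenter link: search engines are told not to pass reputation.
print(comment_link("http://example.com/spammy", "cheap pills", nofollow=True))
# Trusted link in your own post: reputation (link juice) flows normally.
print(comment_link("http://example.org/friend", "a friend's blog", nofollow=False))
```

Escaping the URL and link text also guards against commenters injecting markup, which matters in exactly the comment sections where nofollow is most useful.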

Why Working From Home is Nothing New | Work From Home

Working From Home is Nothing New

Working from home is a whole new way of working — a revolution in industry, in society, in the way we live. Or is it? While making a living by sitting in a café with a frappucino and a two-way link to the cloud might be something your parents never dreamed of doing, the idea that you can ignore the corporate world and earn from home is actually about as modern as iron horseshoes and knitting needles. In fact, not only are today’s home-based tech workers more traditional than the average cubicle drone, they actually have a long way to go before their numbers come close to those of the good old days despite recent trends.

According to the US Census Office, the number of people who worked at home more than two days a week increased by 56 percent between 1980 and 1990, from 2.2 million to 3.4 million. That's a remarkable rise, and one made all the more impressive by happening before the expansion of the Internet. In the decade following 1990, as communications improved and email replaced memos, the figure increased by a further 22.8 percent to reach 4.2 million people. By 2000, the Census Office reports, 3.3 percent of the working population was able to skip the commute for most of their workweek.


When One in Fourteen Worked from Home

But those are still significantly lower than the numbers in 1960 when almost 4.7 million people were earning their keep from home – a full 7.2 percent of the population. That number halved over the following twenty years, a decline which the Census Office puts down primarily to the closure of family farms and the movement of doctors and other professionals away from home offices and towards large shared practices.

But it wasn’t just the last of the small farmers and home-visiting doctors who were able to call their homes their workspaces in the 1960s. Some of the most important contributions to American culture were being produced in home offices even before the era of free love and one-way commutes to Southeast Asia.

Pay a visit to Frank Lloyd Wright’s home in Oak Park, Illinois, for example, and you’ll be able to see not just the house in which the creator of the Prairie style lived from 1889-1909 but also the office in which he designed 125 of the country’s most important structures. Nor was his own home just a workplace. It was also an architectural laboratory on which he tested his design concepts and theories. Most home workers work in their house. Frank Lloyd Wright’s house was also his work.

That a creative professional like an architect should be able to avoid an office building is perhaps not surprising. Designers, painters, sculptors and other arty types tend to work alone, relying on their own inspiration to deliver their ideas. They rarely need the kinds of equipment that’s best supplied by large office buildings and having secretaries, assistants, sales staff and watercoolers around might even be distracting. Around 40 percent of artists are believed to work from home studios – or at least they do until children come along and claim the studio as their bedroom.

The Web’s Work from Home Industrial Revolution


According to the 1990 census though, almost half of all home workers were in the service industries, which included business and repair work, entertainment and recreation, and “other professional and related services.” By 2000, 1.9 million people were providing “professional services” from home – by far the most popular category – but there were also more than 42,000 people preparing food professionally in their own kitchens and over half a million cutting hair, giving massages and delivering other kinds of personal care. Interestingly, almost 5,000 people in the fishing, hunting and forestry professions worked from home at the start of the millennium too. You have to wonder about the size of their yards.

Even this variety might not be anything new. Perhaps the most important characteristic of the Industrial Revolution was the movement to cities as factories became the shared workspaces of a new urban working class. But what were those new proletarians doing before the opening of the mills and the invention of automated looms that could fill factory floors and lop off children’s fingers? Some, as in early twentieth century America, would have been driving horses on farms but others would have been crafting from home. For women in particular, the loss of hand looms to the spinning jenny meant a shift away from home and family to cotton mills and hard-nosed bosses. For men too, the rise of the assembly line marked the end of the kind of sweating, hammering and hand-crafting of unreliable quality that could be done in a home workshop.

Interestingly, Peter Sweeney, Founder & CTO of semantic technology firm Primal Fusion, has described Web 3.0 as the Internet's own industrial revolution, a time when the social connections of Web 2.0 give way to the automated production of content. Wolfram Alpha, he says, is one example of the way in which information can be produced automatically, without the kind of work-at-home handicraft that predated Dickens and now characterizes the Web's co-working content producers.

That sounds unlikely. Easy communication is only going to increase the returns to home-working, and recession-hit tech types who have spent the last few months consulting from home will take some persuading to get back into the traffic jams when the economy does pick up. But today's home-workers are now primarily tapping keyboards rather than driving tractors. They're in the cities rather than in the dust fields of Oklahoma (although many of them, like those former agriculturalists, are also now in California). And unlike independent spinners and weavers, they find that they can compete easily with the productivity levels of factory- and office-based employees.

Working from home then isn’t a new way of working. It’s a return to an old, traditional – and more enjoyable — way of working, and don’t let the Luddites tell you otherwise.

Which URL Shortening Service is Best to Use

URL Shortening Services/ Top 10 URL Shortening Services
URL shortening services have taken strong root in the internet community, and there is a lot of bias about which one becomes the weapon of choice. For instance, Twitter users commonly use TinyURL for their shortening needs. But TinyURL is far from the only URL shortening service out there; in fact, each service has something special to offer, so don't take sides just yet.

Doiop.com
Doiop brings something interesting to the table: you can pick your own URL. Unlike many other services that pick a random URL for you, this feature lets you remember the shortened URL. If you've ever tried to remember the URLs from other services, you know what we mean.


SnipURL.com
SnipURL can change the URL to something more memorable, a big plus. What we liked about SnipURL is that it provides traffic statistics for the newly created URL. A private key can also be created to limit access to the URL. Finally, there is just something about the “Snipped! 92% of original url” line that makes the experience very satisfying.

DwarfURL.com
DwarfURL is an elegant yet simple AJAX application. Like SnipURL, it enables traffic statistics for submitted URLs. What really made it shine, however, was the fact that a Mozilla Firefox Add-on can be installed to use their application on the fly. Clearly, this application was optimized for speed and efficiency buffs.

MemURL.com
MemURL is the classic URL shortening service. It doesn’t bring a lot to the table, but it does get the job done. Don’t discredit it just yet, though: what really sets it apart from other services is its naming convention. MemURL automatically creates human-readable URLs. Sure, they aren’t real words, but they sure are fun to pronounce! Services like this will likely take over once services that let users pick their own URLs become flooded; a pronounceable URL beats trying to guess one that no one else has taken.

TinyURL.com
At long last, we get to Twitter’s dominant URL shortening service. TinyURL offers the same kind of shortening everyone else does, but it packs a punch: it includes a preview URL. How many times have you clicked on a shortened URL, only to find that it led to a virus, a disgusting picture, or a web prank? With the preview functionality, you can see where a link leads before you follow it.

TraceURL.com
Out of all the URL shortening services available, TraceURL is easily one of the most advanced. It features a polished user interface that makes the process incredibly simple. Sadly, you have to register before you can use the service (which is free). It supports tracking statistics and the ability to choose your own URL, and features a simple pop-up interface. If you need options, TraceURL is your shortening service.

URLTea.com
At first glance, URLTea.com is just another URL shortening service, but something more is going on behind the scenes. When you submit a URL to URLTea, it automatically copies the new URL to your clipboard! This makes the copy-and-paste step a thing of the past, and we aren’t sad to see it go. Just make sure there is nothing important in your clipboard that you’d like to keep before using URLTea.

NotLong.com
NotLong is yet another URL shortening service, offering statistics, password protection, and the ability to choose your own URL. It’s fast and simple; what more could you ask for in a URL shortening service?

XAddr.com
XAddr is an interesting URL shortening service. It offers the basics and not much else. What it does throw into the mix, however, is extra security options, much like TinyURL. When the shortened link is clicked, you are brought to a landing page rather than the direct URL. XAddr then gives you the option to check McAfee SiteAdvisor for more information on the URL. If security is a concern, XAddr is a great alternative to TinyURL.

Azqq.com
Last but not least, we have AZQQ.com. We like this shortening service because it doesn’t do anything extra; it just shortens URLs. Sometimes it isn’t about how many features you can pack into a URL shortening service, but how fast the service works. If you just want bare-bones URL shortening, AZQQ.com is your man.
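Under the hood, most of these services boil down to a trivially simple HTTP request. As a sketch, here is what shortening looks like against TinyURL's long-standing plain-text api-create.php endpoint (treat the endpoint details as an assumption; check the service's own docs before relying on it):

```python
import urllib.parse
import urllib.request

# Assumed public endpoint: TinyURL returns the short URL as plain text.
TINYURL_API = "http://tinyurl.com/api-create.php"

def build_shorten_request(long_url: str) -> str:
    """Build the API request URL for shortening long_url."""
    return TINYURL_API + "?" + urllib.parse.urlencode({"url": long_url})

def shorten(long_url: str) -> str:
    """Fetch the shortened URL (requires network access)."""
    with urllib.request.urlopen(build_shorten_request(long_url)) as resp:
        return resp.read().decode("utf-8").strip()
```

Services with richer features (statistics, custom aliases, passwords) typically just take extra query parameters on top of the same pattern.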


Please scream loudly in the comments if you want to propose a different/better solution. Is usage/popularity or functionality the best metric for this kind of decision? Should we have "guest" URL shortening services based on unique functionality?

First space hotel to open by 2012

A private space tourism company, Galactic Suite Limited, has announced that it will be opening the first space hotel in 2012.

The company is offering a three-night stay for 3 million euros ($4.4 million).

The zero-g resort will orbit the Earth at 30,000 km/h, circling the planet once every 80 minutes while offering visitors 15 sunrises per day.
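As an aside, those two figures don't quite agree: an 80-minute orbit works out to 18 circuits (and sunrises) per day, not 15, while 15 sunrises would correspond to a roughly 96-minute orbit. A quick check:

```python
# Sanity-check the orbit figures quoted above.
MINUTES_PER_DAY = 24 * 60

orbit_minutes = 80
sunrises_per_day = MINUTES_PER_DAY / orbit_minutes  # 18.0

# Orbit length that would actually yield 15 sunrises a day:
orbit_for_15 = MINUTES_PER_DAY / 15  # 96.0 minutes
```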

Galactic Suite plans to transport its travelers to space via Russian rockets from a spaceport to be built on a Caribbean island.

The trip to the hotel will take a day and a half, and guests will undergo an eight-week training course in the Caribbean prior to launch.

Currently, the GS Project is still in the design phase. The project consists of the spaceport, the ship, and the orbiting space resort.

The hotel is expected to be constructed of connecting pods around a central hub. Each pod will be able to hold four guests and two astronaut-pilots.

Galactic Suite CEO Xavier Claramunt says “an anonymous billionaire space enthusiast” has granted $3 billion to finance the project.

Galactic Suite claims that more than 200 people have already inquired about staying at the hotel, with 43 of them actually placing reservations.

Top 6 Ways to Check Your Website Backlinks

How to Check Website Backlinks / Check PR / Check Internal PageRank / PR Checking Tools

1. Google Webmaster Console (http://www.google.com/webmasters) - the absolute leader for checking backlinks. The Webmaster Console is probably the most accurate source for backlink checking. The big downside is that you can only check backlinks for your own websites. Also, some people say it's yet another way for Google to get more information about your site; if you are doing something dodgy, be careful.

2. Yahoo SiteExplorer (http://siteexplorer.search.yahoo.com) - if you don't have access to the Webmaster Console, then Yahoo is your tool. It's not as accurate as GWC but still good enough. You can see up to 1000 backlinks for every domain, but the results include nofollow links as well. You can also export backlinks to a TSV file (like CSV but separated with tabs), though only for the current page.

3. Google allinanchor command - This trick is not widely known. Using the allinanchor command with your website name, you can find quite a lot of backlinks for your site. The advantage is that this method can surface links which were not reported by the Google link command or Yahoo SiteExplorer, e.g.

allinanchor:www.yourwebsite.com or allinanchor: yourwebsite

The disadvantage is that if your website name is a generic phrase then you will get lots of results which are not links to your site.


4. SEO Elite, Backlink Analyzer, etc. (http://tools.seobook.com/backlink-analyzer), (http://www.seoelite.com) - These tools can help you dig up backlinks from search engines more easily. You don't have to manually check every page in Google/Yahoo, as they compile a report with backlinks along with other useful information like Google PageRank, Alexa rating, etc.

5. Alternative search engines - Other search engines can also give you information about backlinks:
MSN - supports the link: operator, like Google and Yahoo - link:http://www.yourwebsite.com
Exalead - supports the link: operator, like Google and Yahoo - link:http://www.yourwebsite.com
These engines can sometimes turn up a backlink not reported by Yahoo SiteExplorer, but mostly it's a waste of time.

6. Google Alerts (http://www.google.com/alerts) - This one is less well known. You can set a Google Alert on your domain, e.g. http://www.yourwebsite.com, and Google will send you an email when the link is found somewhere on the web. This is a handy way to check for new links, but Google will also report pages where your website is merely mentioned and no real link is present. The disadvantage of this method is that Google Alerts are not as accurate as they could be; sometimes emails are not sent even though Google has indexed a page containing the link.
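When an alert (or any backlink report) points you at a page, it helps to verify whether the page really contains a link rather than just a text mention. A minimal sketch using Python's standard html.parser (the class and function names are mine, not part of any Google tool):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collect the href of every <a> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_to(html: str, domain: str) -> bool:
    """True if the page contains an actual <a> link to the given domain."""
    finder = LinkFinder()
    finder.feed(html)
    return any(urlparse(h).netloc.endswith(domain) for h in finder.hrefs)
```

A bare mention like "visit www.yourwebsite.com" has no anchor tag, so it correctly fails this check while a real hyperlink passes.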


Best PageRank Checking Tools
http://www.smartpagerank.com/pagerank-backlinks.php
http://check-backlink.com/cgi-bin/bl_checker.pl

Have I missed something? Any suggestions/corrections are welcome.

What Is PageRank Leakage? Is Your Website Leaking PageRank?

Improving and maintaining PageRank is a top priority for webmasters because it affects how much traffic a website will receive from Google. But what many webmasters do not realize is that their websites are leaking PageRank, causing their overall site-wide PageRank to be lower than it should be.

What Is PageRank Leakage?
PageRank leakage is when outbound links to external websites give away PageRank that could be better distributed among pages within the website. The effect of PageRank leakage is most pronounced when a website has many links to external sites and when internal pages are not well linked.
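To see the effect, here is a toy PageRank calculation (a simplified version of the standard iterative formula, with made-up page names) comparing a two-page site where every page links out against one where only a single page does:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }
    return rank

# "leaky": every internal page links out; "tight": only one page does.
leaky = {"home": ["about", "external"], "about": ["home", "external"], "external": []}
tight = {"home": ["about"], "about": ["home", "external"], "external": []}
```

Running both, the "tight" site ends up holding noticeably more PageRank on its internal pages, which is exactly the leakage effect described above.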

How To Prevent PageRank Leakage
There are two main ways to prevent PageRank leakage:

1. Eliminate many outbound links
Many websites and blogs have footers and sidebars which contain outbound links. The footer, for example, might contain links to the company that developed the website or to the content management system on which it was built. And the sidebar often contains a blogroll.

The problem is that these links are typically displayed on every page of the website, causing PageRank to exit and adding no value to the page.

To fix this problem, you can remove these redundant links from every page and place them all on a single page. This solution is popular because it lets webmasters still share a limited amount of PageRank while not dramatically contributing to the website’s overall PageRank leakage.

2. Use Nofollow Links
Another way to limit PageRank leakage is to stop links from passing PageRank altogether. This is easily done by adding rel="nofollow" to links, as shown in the example below.

<a href="http://www.example.com/" rel="nofollow">Example</a>
This method is very effective; however, many webmasters are reluctant to add nofollow to links because it is often considered selfish to avoid passing PageRank to legitimate websites. You also don’t want to stop linking out entirely, because there is evidence that Google takes your outbound links into consideration when ranking your website. But you should consider using nofollow when linking to advertisers and sponsors, and on any links you do leave in your footer or sidebar.
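If you want to apply this across a template programmatically, a rough sketch is below. This is a regex hack for illustration only; the function name is mine, and a real implementation should use a proper HTML parser:

```python
import re

def nofollow_external(html: str, own_domain: str) -> str:
    """Add rel="nofollow" to anchor tags that point outside own_domain."""
    def fix(match):
        tag, href = match.group(0), match.group(1)
        if own_domain in href or "rel=" in tag:
            return tag  # internal link, or a rel attribute is already present
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*href="([^"]*)"[^>]*>', fix, html)
```

External anchors gain rel="nofollow" while links back to your own domain are left untouched.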

What About PageRank Distribution within Your Website?
You can also use these techniques to control PageRank distribution within your website. If you have some preferred content that you want to rank well, you can prioritize it by linking liberally to it from your other pages. Additionally, if you want to avoid wasting PageRank on a page, you can opt to use nofollow links when linking internally.

In the Google Webmasters/Site owners help, Google calls this crawl prioritization and gives the following example:

Search engine robots can’t sign in or register as a member on your forum, so there’s no reason to invite Googlebot to follow "register here" or "sign in" links. Using nofollow on these links enables Googlebot to crawl other pages you’d prefer to see in Google’s index.

How To Install CommentLuv On Your Blogspot Blog

How To Enable The CommentLuv Plugin On A Blogspot Blog

I took the skeleton instructions from here for easier reference and cross-navigation. You can find the original guidelines for installing CommentLuv on your blog there, and switch back here if you want more detailed explanations.


The first thing to remember is that CommentLuv is a JavaScript-based script, so you have to allow JavaScript to run in your browser. Don't be like me, activating NoScript in my Firefox and then wondering why CommentLuv didn't work. :P My instructions, once again, are a modified version of the instructions posted on commentluv.com. These instructions have been tested thoroughly and confirmed working. I recommend you follow them step by step so that nothing goes wrong.

Step 1
Register your site and verify it
This is a simple process. Don't tell me you don't know how to register a new account on a website... -.- Go here -> http://www.commentluv.com/wp-login.php?action=register to register a new account on commentluv.com.

Fill out the necessary fields and continue. You will be told that a confirmation email has been sent to your mailbox. Check it and click the link embedded in it to activate your account.

Next, you have to add the CommentLuv tag to verify your site. Go here: http://www.commentluv.com/settings/ and fill in your site address. You will have to put the HTML code provided there (below the badges display) on your blog. Just add a new page element (JavaScript) and paste the whole code into it.

Once you have inserted the code, click 'verify' to claim your site. The verification step is now done.

Step 2
Import your comments and convert your blogger template
Go here: http://js-kit.com/comments/blogger.cgi .

Again, it's a JavaScript-based website, so disable NoScript if you have it installed in Firefox, or add an exception for this website.

Fill out the fields under 'Step 1: Importing Your Comments.' Blog URL should be your blog's URL, e.g. eternalblackzero.blogspot.com; Account should be the Blogspot email address you regularly use to log in to your blog; and Password is that email address's password. Click the import button and wait; a confirmation screen should appear in a minute or so, depending on your connection speed.

Then, please follow instructions posted on 'Step 2' carefully.

If you have followed everything correctly, your blog's comment form should have changed by the time you finish 'Step 2' on js-kit.com. Refresh your browser if you don't see the difference.


Step 3
View the commentluv code
Go back to commentluv.com to get your code now. The code is located here: http://www.commentluv.com/download/blogger-commentluv/ .

You HAVE to click 'Click for your code', and a small text box will appear. Copy the whole text and switch to your Blogspot control panel.

Step 4
Install to your blogger template

Add one page element (JavaScript/HTML). I suggest you leave the title field blank so the widget won't clutter your blog's appearance. The script should stay invisible, as it only adds functionality to your comment form; it doesn't change your blog's general look and feel.

Paste the code into the 'content' field.

And there you go, you've successfully installed CommentLuv on your blog. ^^b But there is one more thing you need to know:

Step 5
Your js-kit settings
Now go to your comment form and click the "powered by js-kit" hyperlink. It should take you to the js-kit settings page.

Here you can change many settings for your CommentLuv widget. I personally didn't change anything (before I removed my widget to experiment with it on my other test blog) and was quite satisfied with CommentLuv.

If you'd like to uninstall CommentLuv, just restore the original template that was backed up during the js-kit installation, and Blogspot will delete the CommentLuv widget automatically. Easy, huh? ^^

And that's the end of my tutorial. I hope you're able to make good use of this great widget. If you have anything you'd like to say, post it in my comment form here.

List Of Cool Blogs Using CommentLuv Plugin

  • Nihars World

  • Shout Me Loud

  • Domain Marvelous

  • Blog Solute

  • Letssermo

  • Blogussion

  • Teenius

  • Serradinho

  • Web Journey

  • i Blog Master

  • AnimHut

  • MyBlog2Day

  • The Marketing Park

  • Technically Personal

  • Quick Online Tips

  • Asnio

  • Dollar Shower

  • Better Blogging For Bloggers

  • Chidimar

  • Blogging Diary

  • The Anand

  • Cats Who Code

  • Guide To Tech

  • Amit Bhawani

  • Tech Zoomin

  • Pallab

  • Jacob Yap

  • Tech 18

  • Tech Genuine

  • Techie Buzz

  • Dat Money

  • Tricks Daddy

  • Shanker Bakshi

  • Dish Tracking

  • Softpoint

  • Techie Blogger

  • Technobuz

  • My Technology Guide

  • PSD Recipes

  • Novice Bloggers


  • If you own a blog or know a blog which has the CommentLuv plugin and you would like it displayed here, please mention it in the comments with some details and I'll add it to this list!
