A Simple SEO Guide for Companies in Sri Lanka – Part One

The power of Search Engine Optimization has not yet been unleashed in Sri Lanka. Apart from a few high-tech entrepreneurs and modern business owners who practice SEO, the large majority of businessmen and companies in Sri Lanka are not benefiting from it.

Take my company as an example. We have a strong marketing consulting arm, and we do corporate training, marketing planning, market research, marketing communication, and so on. So far we have not advertised or run any awareness programme; the company depends entirely on search behaviour. We find the inquiries highly effective, because they come from companies that need our services, individuals looking for consulting advice, and entrepreneurs looking for business ideas.

For example, do a search for “corporate training in Sri Lanka” or “marketing consulting in Sri Lanka” and you will find the epitom.org website appearing at the top.

In this post I want to share some valuable insights on how to move your website up the search results for your target keywords.

To get the best search advantage, you should first have a good website. Let us start with the technical audit part. The most important first steps are:

  1. Create a webmaster account (Google Webmaster Central)
  2. Create a Google Analytics account

Once that is done, let's open the site and do a simple technical audit, because well-structured websites perform well in SEO. We are not going to cover any web design or new development work here. If you have a simple site and want to implement proper SEO, this guide covers how to do a simple, effective SEO audit, the aspects to consider, and how to implement them.

First you have to do a website technical audit. Getting the technical side of your site right is a strong foundation for improving rankings.

How do you perform a technical audit? The audit covers the following areas.

1. Website Crawlability 

  • Site Search and Indexed Pages
  • Cached Pages in Google
  • XML Sitemap
  • HTML Sitemap
  • Correct Use of Robots.txt File
  • Internal Linking Structure

2. Internal SEO Health

  • Page Titles
  • Use of H1, H2 Tags
  • Meta Descriptions
  • Availability of Unique Content on Pages
  • Site Structure and Navigation
  • URL Structure
  • Breadcrumbs
  • Duplicate Content
  • Content Quality and Content Strategy
  • Pagination
  • Site Errors / Server Errors
  • Correct Use of 404 Pages
  • Use of www and Sub Domains
  • Image Optimisation

3. Link Popularity and Link Profile

  • Inbound Links Count
  • Link Juice and its Distribution
  • Link Diversity
  • Anchor Text Profile

Website crawlability check

1- Site search and indexed pages

Let's start with the website crawlability check. How do you check whether your website has been picked up by Google? Simply go to Google and search for site:yourwebsite-url. This will show you how many pages have been indexed in Google. To learn more about indexing, read https://support.google.com/customsearch/answer/4513925?hl=en
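
For example, if your domain were example.com (a placeholder), the query would be:

site:example.com

You can also scope the check to one section of the site, for example site:example.com/blog.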

There are two situations. If the site is indexed, there are no issues and you simply work on increasing the number of indexed pages. If it is not indexed, and your site has been live for some time, then that is certainly an issue.

If your site is not indexed, the first thing to check is the meta robots tag. There are three settings to know:

1- <META NAME=”ROBOTS” CONTENT=”NOINDEX, FOLLOW”> – Search engines can crawl the page and follow its links, but will not list the page in search results.

2- <META NAME=”ROBOTS” CONTENT=”INDEX, NOFOLLOW”> – The page can be listed in search results, but search engines will not follow the links on it.

3- <META NAME=”ROBOTS” CONTENT=”NOINDEX, NOFOLLOW”> – Neither: the page is not listed and its links are not followed.

Sometimes when you build a site on a content management system (WordPress / Joomla), the theme may have automatically set noindex / nofollow tags. If you don't see anything indexed, use http://www.seoreviewtools.com/bulk-meta-robots-checker/ to check whether your pages are set to index and follow.
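
If you prefer to check a page yourself, here is a minimal sketch using only the Python standard library. The URL is a placeholder, so substitute one of your own pages; note that a missing meta robots tag simply means the default (index, follow) applies.

from html.parser import HTMLParser
from urllib.request import urlopen

class MetaRobotsParser(HTMLParser):
    # Remembers the content of any <meta name="robots"> tag it sees.
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.robots_content = attrs.get("content") or ""

url = "http://www.example.com/"  # placeholder: use one of your own pages
page = urlopen(url).read().decode("utf-8", errors="replace")

parser = MetaRobotsParser()
parser.feed(page)

if parser.robots_content and "noindex" in parser.robots_content.lower():
    print("Warning, this page is set to noindex:", parser.robots_content)
else:
    print("Meta robots:", parser.robots_content or "not set (defaults to index, follow)")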

Let’s see the detailed implementation of these in the next post.

We have a large investment company as a client, and they had a good presence in search. But soon after they relaunched their site, the presence was not as strong as before. When we checked the site, the meta robots tag was set to “noindex, nofollow”. Once we changed it, all the pages were back on track.

2- Cached pages 

The cached page shows how your web page looked to Google when it last visited. To learn more about it, read https://support.google.com/websearch/answer/1687222?hl=en

You can use this information as a check. Google primarily reads the text of a website, not the full user interface. To see how Google sees your site, go to the Google search results page, type your domain name, and click the arrow next to your result. This gives you the following information:

1- How Google reads your site (click the text-only version).

2- The date and time Google's bots last visited your page.

3- How the website looked when Google's bots last visited.
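
You can also jump straight to the cached copy with Google's cache: operator; for example, with a placeholder domain:

cache:example.com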

3- XML sitemaps

XML sitemaps are one tool that helps content creators establish their stake as the content originator. An XML sitemap gives Google's bots directions about the site structure. Even without a sitemap Google can still find a website, but with one the bots can get through more efficiently and make sure they look at all of your pages.

Check whether you have an XML sitemap installed by trying yourdomainname.com/sitemap.xml.
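
For reference, a minimal sitemap.xml looks something like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
</urlset>

Each page you want crawled gets its own <url> entry; the <lastmod> tag is optional.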

Similarly, an HTML sitemap helps site visitors navigate the site easily.

4- Correct Use of the Robots.txt File

Website owners use the /robots.txt file to give web robots directions and information about their site. This is known as the Robots Exclusion Protocol.

This is how it works: before a robot visits a page such as http://www.example.com/welcome.html, it first checks for http://www.example.com/robots.txt. This is what it might find:

User-agent: *

Disallow: /

The “User-agent: *” line indicates that the section applies to all robots, and “Disallow: /” tells them not to visit any page on the site. There is more detail on /robots.txt later in this post.

5- Internal Linking Structure

 

Different web practitioners use different terms for this, but internal linking is the term best understood by the SEO community. In general terms, internal linking refers to any link from one page on a domain to another page on the same domain. This covers both the main site navigation and the links within articles to related content. Here we focus on the latter: the editorial links within articles, because that is the more common SEO tactic and it is controlled by the site's editors and writers rather than a tech team.
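
For example, an editorial internal link is nothing more than an ordinary anchor tag pointing at another page on the same domain; the URL and anchor text below are hypothetical:

<a href="/corporate-training-sri-lanka">our corporate training services</a>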

 

About /robots.txt


There are two important points to keep in mind when using /robots.txt:

  • Robots can simply ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email-address harvesters used by spammers, pay no attention to it.
  • The /robots.txt file is publicly accessible. Anyone can open it and see which sections of your server you do not want robots to visit.

All in all, do not try to use /robots.txt to hide information.

Details about /robots.txt

The /robots.txt mechanism is a de-facto standard: it is not owned by any standards body, and it is not actively developed.

How to make a /robots.txt file

In short: the /robots.txt file must be placed in the top-level directory of your web server.

To elaborate: when a robot looks for the “/robots.txt” file for a URL, it strips the path component from the URL (everything from the first single slash after the host name) and puts “/robots.txt” in its place.

For instance, for “http://www.example.com/shop/index.html”, the robot removes “/shop/index.html” and substitutes “/robots.txt”, which results in “http://www.example.com/robots.txt”.

Bearing this in mind, as a website owner you need to put the file in the right place on your web server for that resulting URL to work. Usually that is the same place where you put your site's main “index.html” landing page. Exactly where that is, and how to put the file there, depends on your web server software.

Remember to use all lower case for the filename: “robots.txt”, not “Robots.TXT”.
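
That path-stripping rule is easy to sketch in a few lines of Python (the example URL matches the one above):

from urllib.parse import urlsplit, urlunsplit

def robots_txt_url(page_url):
    # Keep only the scheme and host; replace the whole path with /robots.txt
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_txt_url("http://www.example.com/shop/index.html"))
# prints http://www.example.com/robots.txt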

What should you put in it?

The “/robots.txt” file is a plain text file containing one or more records. A typical record looks like this:

User-agent: *

Disallow: /cgi-bin/

Disallow: /tmp/

Disallow: /~joe/

Three directories are excluded in this particular case.

It's also important to remember that you need a separate “Disallow” line for each URL prefix you want to keep out; you cannot write “Disallow: /cgi-bin/ /tmp/” on a single line. Also, you may not have blank lines within a record, because blank lines are used to separate multiple records.

At the same time, note that globbing and regular expressions are not supported in the User-agent or Disallow lines. The “*” in the User-agent field is a special value meaning “any robot”. Specifically, you cannot have lines like “User-agent: *bot*”, “Disallow: /tmp/*” or “Disallow: *.gif”.

What you should exclude depends on your server. Everything that is not explicitly disallowed is considered fair game to retrieve. Here are some examples:

To exclude all robots from the entire server

User-agent: *

Disallow: /

To allow all robots complete access

User-agent: *

Disallow:

(Or simply create an empty “/robots.txt” file, or don't use one at all.)

To exclude all robots from part of the server

User-agent: *

Disallow: /cgi-bin/

Disallow: /tmp/

Disallow: /junk/

To exclude a single robot

User-agent: BadBot

Disallow: /

To permit a single robot

User-agent: Google

Disallow:

User-agent: *

Disallow: /

To exclude all files except one

This is a little awkward, because there is no “Allow” field in the original standard. The easiest way is to put all the files you want disallowed into a separate directory, say “stuff”, and leave the one file in the level above this directory:

User-agent: *

Disallow: /~joe/stuff/

Alternatively, you can explicitly disallow every page you want to keep out:

User-agent: *

Disallow: /~joe/junk.html

Disallow: /~joe/foo.html

Disallow: /~joe/bar.html
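
If you want to check what a set of rules actually blocks, Python's built-in robotparser module can test them. Here is a small sketch using the placeholder domain and paths from the examples above:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # placeholder domain
rp.read()  # fetches and parses the live robots.txt

# can_fetch(user_agent, url) returns True when the rules allow the fetch
print(rp.can_fetch("*", "http://www.example.com/~joe/junk.html"))
print(rp.can_fetch("*", "http://www.example.com/index.html"))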

Google Personalized Search Engine Optimization (SEO)?

It's interesting to see how Google keeps changing the way it operates. Search results vary from time to time, and recently I observed another change (I may be late to it). I'm not sure what to call it, but it looks like “personalized SEO”. I'd say it could bring real benefits to a firm if handled and used carefully.

My blog always ranks high in Google for important keywords and brings me several visitors a day. Whenever I update the blog, the new post quickly climbs to a top position in Google's rankings. For some keywords I wondered how it got ranked so high. Later I noticed that Google appears to link Gmail and search together: if a person in your Gmail contacts searches for a keyword related to your post or site, that post or site comes up higher in their rankings. Let's try an example and see.

Step one – I logged in to Gmail from another account, one which already had my main Gmail address saved as a contact.

Step two – I had written a post on this blog about investing in Sri Lanka, and searched for the keyword “Invest in Sri Lanka”; my blog appeared on the first page as the 8th and 10th results.


Then I logged out of Gmail and searched for the same keyword again, and I could not see the same results. So I'm not certain about my guess, but I can see that Google is tempting people to use Gmail more by giving them more personalized search results.

Seeing this gave me an idea: it could be good for Sri Lankan firms in the following ways.

1- Use Gmail to communicate with customers (especially travel firms).

2- When customers reply, they are saved as contacts.

3- At the same time, share your online presence, such as your website and blogs. Both should carry killer content.

4- Now, when those customers search for relevant keywords, your content appears on the first page, because the firm is in their contact list with shared content. That can surprise them and build credibility.

(But using Gmail for customer contact may not look professional, since people may have concerns about trust.)

These are just my observations, and the ideas are pure speculation. If I'm wrong, please correct me. 🙂

Managing reputation online

It has now become common practice for people to search for a brand they want to purchase, both offline and online. Whenever I want to buy a certain item, I always turn to Google for the exact specs and the brands currently being marketed. Recently, when I wanted to purchase a laptop, I searched a number of Sri Lankan sites for the names of the companies marketing the particular brand I wanted to buy. This helped me find the company that marketed the exact brand I preferred, and I must say I succeeded in buying one to my satisfaction.

In Sri Lanka, online business is not a big hassle because only a few companies provide online services. I would also point out that customers in our country still rarely search for information online. To promote a brand, it is always important to get customers' opinions of the products they have purchased, online or offline. Although it is difficult for an organization to track offline word of mouth, it is easy to track the online buzz about a brand. It is possible to get customers' views online, but there is a danger in it as well: if unsatisfied customers post many negative comments about a particular brand, the company stands to lose a large share of the business made online. To overcome this, the company should have a proper strategy for managing its online reputation.

As a marketing manager, you will be happy to see positive comments about the products your company markets, but very disappointed when negative results appear in Google searches. If you come across a situation like this, you can implement the following tactics to overcome the problem; in the long term they will give you good results.

1 – If you have a site, try creating some subdomains to get them listed in the search results. Make sure you do some link-building activities to increase the popularity of the subdomains.

2 – Register other domains that represent your brand name. E.g., if your brand name is genuine.com, think about registering domain pointers like genuine.net, genuine.org and genuine.info, and add some valuable content to them.

If the .com site is transactional, use the .info site to tell people about your brand: new releases, new features, a press room, corporate information and so on.

3 – Create blogs related to your brand name and optimize them. In my experience, a well-written wordpress.com blog is picked up well by the search engines, e.g. genuine.wordpress.com. Add some good content and update it regularly; once the search engines identify your blog as a useful one, it will show up for your brand keyword. You can also try the same on Blogger.

After creating blogs representing your brand name, submit them to blog communities. Blogged.com, mybloglog.com and blogcatalog.com are some good communities to submit to. Make sure the usernames you create represent the brand name.

Eg – http://www.blogged.com/profile/yourbrandname

4 – Create social networking profiles to represent your brand. Facebook, Myspace, Flickr, Twitter, YouTube and Dailymotion are some good sites. Avoid using nicknames; use your brand name instead.

5 – Create a company profile on LinkedIn, whose content is also highly valued by search engines. Don't have a company profile on LinkedIn? Create an employee profile and add your company name; once one entry is added, you will see a page created for your brand. On Naymz you can also claim your company identity.

6 – Create a sample site on Webnode. It is a free website-creation tool, but highly search-engine friendly: the built-in options let you set up sitemaps and optimize the site. Add some related content to the site and do some bookmarking.

7 – If blogs or forum sites contain negative comments about you, ask them nicely to remove the posts; just explain the situation you are in. In general, sites won't remove them, since that can be seen as a breach of independent opinion, but you can try.

8 – If you see some good results on the second page, try to push those pages up with some optimization activity.

9 – Do press releases and find good PR sites to distribute them. These will be crawled quickly and get listed.

10 – If you can create widgets, use widgetbox.com to create and submit them. These also get listed quickly in search results.

11 – Do some bookmarking as well.

Do the above with an action plan. It will bring you positive results after a few months, and you will see most of the negative results drop down. If the search results for your brand keywords are not competitive, your job will be much easier and you will see results very quickly.

Coming top of the Google search results within an hour

It is always difficult to reach the top of the Google results within an hour. I worked out a few techniques and finally got an idea of how it works. I heard that Google recently changed the way its crawlers work. I created a video to test this technique, and named it with a title that had 17,000,000 search results in Google; the keyword title was “new marketing trends blog”.

 

Video – http://www.youtube.com/watch?v=yPR02oDXriU


 

I used one other technique as well – better to keep that a secret 🙂

Within an hour, my video had reached the first page out of 17,500,000 results. (Go to http://www.google.co.uk and type the keyword “new marketing trends blog”.)


I think it is time for marketers to think seriously about YouTube. Since being acquired by Google, it has become more search-engine friendly. In my view, anyone can optimize their videos for targeted keywords and get them to the top. I also observed that if you upload a video to more than ten video sites and optimize each of them, the results are even better, because if you optimize only one video channel it will drop down after a few days. Continuous optimization helps you retain the position.

Hot SEO trends 2009

Keeping up with all the changes in search engine optimization is always a difficult task. It's an ever-changing industry with new trends and dimensions, and SEO is still going to have a major impact on online business. I'm currently reading for my e-marketing award, so I have been searching for SEO trends and came across the following on various sites. The trends are interesting, and based on my experience I have a few points to add.

1 – Google is going to remain at the top of the curve. According to CNET, Google handled nearly 70% of Internet searches in 2008. What does this mean? You have to optimize your site for Google. It is also worth noting that Google has made some 400 changes to its algorithms.

2 – Links are important. Get quality links pointing to your site, and do not forget to make your content exceptional. I think Google monitors this closely: I recently created a blog and did continuous bookmarking, and within two weeks it had PageRank 1. It stayed there for about a month and then vanished again. So make sure you get quality links without spamming. 🙂 Also make your site more informative: recent research suggests there are more “lookers” than “getters”, i.e. people search for information more than they transact. Even if you're selling something, you must provide free information that draws outside resources to your pages, or you're going to get left behind. Quality content = links = credibility = rankings.

3 – Viral and social media. I don't care whether your company sells bubble gum or $100,000 plastic injection-molding equipment: you can benefit from social media. Set up accounts on Twitter and Facebook and get connected. Social media isn't going anywhere, and more and more of your potential customers are using it to make connections. You should be, too. By being active in online communities, you can develop an audience.

4 – Based on recent research, video optimization is also going to be key in SEO. You need a YouTube strategy to reach the top organic positions. This can be done quite easily; I will show how in my next post, including how a video can reach the first Google results page within two hours. My argument is this: if your website is not in the top positions, that is bad, but if your videos are in the top positions instead, you will still do well.

5 – With smartphones like the iPhone and the new BlackBerry becoming wildly popular, local search is going to be huge in 2009. If your products appeal to people on the go, or if you have a local downtown shop of any kind, get busy with local business listing (LBL) optimization. The next customer who drives right past your shop may have been looking for you at that very moment, and found nothing.

6 – SEO is going to become an in-house function. To cut costs, firms will invest more in SEO internally by training and developing competent people in this area.

7 – SEO scams. The downside of increased interest in SEO is that many small business owners will continue to spend money making unethical scam artists rich.

8 – Personalized search results. Personalization of search results has been simmering for a couple of years, but has recently started going mainstream. Google is leading the way with features like SearchWiki and Preferred Sites. On top of that, your location, your recent searches, and even which datacenter your query is sent to can all affect the ten results you see at any given moment. It will become more and more unusual for you and a friend in another state to see the same ten results for the same search.

9 – Local search and mobile search. Mobile search has been on the way for years but never arrived. Until now. Mobile search used to be about as fun as a root canal, but the growth of smartphones, fueled by the iPhone, means mobile search is more enjoyable, more productive, and more popular than ever before. If your business appeals to people who might be searching on the go, local SEO should be a high priority for you in 2009.
