Thu 26 Feb 2009, 16:30 PM | Posted by admin
Tags: Web Design & Development
Back in October 2004, I launched a series of articles outlining the ten crucial steps to a well-optimized website. The steps were:
10. The Extras (all those things that didn’t fit in the first nine steps)
Well, in case you’ve been asleep for the last few years, or in case you’ve just recently joined us in the SEO realm, I – along with some of my good friends in the web marketing world – have decided to rewrite the series with new information and new perspectives.
The New Series
In our updated series we’ll be dropping some of the articles and adding others to account for changes in the industry. Another major change is that we’re going to complement the series with a weekly segment on Webmaster Radio’s Webcology (http://www.webmasterradio.fm/Search-Engine-Optimization/Webcology/) on Thursday afternoons at 2PM EST, where we’ll be conducting interviews and discussing tools with their makers to help our readers and listeners make the most of this information. If you miss the show, you can always download the podcast free of charge afterwards.
The 10 steps covered in this series will be:
1. Keyword Research & Selection
10. Keeping It Up
Step One: Keyword Research & Selection
There are two times in a site’s life when keyword research is conducted – when researching a site to rank in the organic results on the search engines and when researching keywords for a PPC campaign. In our article today we’re going to focus on the former and save the research involved with PPC campaigns for step seven in this series.
So we’ve got the topic down to “just” keyword research and selection for organic SEO campaigns – from there the topic once again splits into a variety of areas. Those that we will cover here are:
The raw data
Studying those who’ve gone before
Understanding your choices
The Raw Data
The raw data is the estimated number of searches per day that you can expect a phrase to get on the major search engines. There are a number of tools you can use to compile this information. Here are some of the more commonly used:
Overture Keyword Suggestion Tool (http://inventory.overture.com/)
Yahoo!’s keyword suggestion tool. It’s fast and it’s free but it has some serious drawbacks. The tool often mixes singular, plural and common misspellings into one so it could lead you astray (admittedly it’s gotten much better lately but still far from perfect).
Is a bed and breakfast in Banff, BC better off targeting “banff accommodation” or “banff accommodations”? How about the very common misspelling “banff accomodations”? That said, the tool is based on easily the largest pool of search data made available in this way, which gives it a huge edge in accuracy.
WordTracker is easily one of the most popular paid keyword research tools. It solves the singular vs. plural vs. misspellings problem; however, the data it draws on comes from a few meta search engines and is not as comprehensive as one might like.
They offer a free trial and have options to pay for just a day or up to a year, so they cover everyone from people who simply need a quick round of research on one site to SEO firms who need the tool on a daily basis. It sells for $59/mth.
Keyword Discovery (http://www.beanstalk-inc.com/resources/recommended/keyword-discovery.htm)
This tool is very similar to WordTracker in its advantages and disadvantages: better specification of keywords, but a smaller pool of data to base them on. I personally prefer Keyword Discovery simply for some of its features and the ability to export data for clients to view easily. Of course, that could well be due to my greater experience with it.
They have a free trial as well and it sells for $69/mth.
Aaron Wall’s Summary (http://www.seobook.com/archives/001013.shtml)
Noted above are some of the most popular tools and the ones I’ve used the most, but there are other tools definitely worth taking a peek at. Aaron Wall did a great summary on his site of the major tools, their pros and cons, etc. Admittedly it’s a couple of years old, so some of the features have changed a bit, but most of it is still valid and accurate.
Now What ...
Now that we’ve looked at the tools, let’s take a look at what we’re supposed to do with them. As noted, we'll cover how to use these tools when launching or updating a PPC campaign in a future article; however, there are still a few areas that we need to consider here.
So let's get started ...
In case no one told you – size doesn't matter. It's not how big it is, it's who's using it. Let's use as an example a phrase we at Beanstalk targeted: “search engine positioning”. At first this was our big phrase; it now gets 7,689 estimated searches/mth (a bit higher than it was back then). “search engine positioning services” gets a lowly 2,636 searches/mth. Of course we should be targeting the one with the higher number of searches (or so I thought).
Once we had attained top-three rankings for both, I started looking through my stats and setting up filters for conversions (forms filled out and visits to our contact page). People who entered “search engine positioning” were certainly interested in our blog and articles, but only those who added the word “services” contacted us. And so the big phrase was abandoned as a target and we began focusing on what I refer to as “buy phrases”. So bigger isn't better if the people you want are searching with phrases that have a lower search volume.
There's another time when bigger isn't better. Which of those two phrases do you suppose we ranked for first? If you guessed the services phrase then you're right. When you launch a new website (which we had) you're likely up against sites that have been around for a while, have some solid backlinks and a good number of pages. You're not going to want to go up against them for the top phrases out of the gate. Choosing to go with phrases that are lower in search volume and lower in competition will almost always result in higher rankings faster, put some money back in your pocket and ready you to go for the bigger phrases.
It's here that the model we followed works well. When you're selecting your short term and long term targets it's wise to choose phrases with the same root keywords (“search engine positioning” and “search engine positioning services” for example). This basically enables you to work towards your long term goals during link building for your short term targets. And who doesn't like to kill two birds with one stone? Or perhaps you have all the time in the world and you're one of those people who likes nothing more than working on developing incoming links.
Which brings us to …
Studying Those Who’ve Gone Before
Imitation is the sincerest form of flattery. Let's just hang onto that thought while we research what those who are successful in your industry are targeting in order to glean some insight into what works.
I've recently discovered (much to my pleasure) a very cool tool that, while a bit pricey for some, simplifies MANY of the processes of keyword research, tracking and competitor keyword dissection. A company called AdGooroo (http://www.adgooroo.com/) has created an awesome keyword tracking tool (I'd call it keyword research but it does a lot more than list off search phrases). The tool allows you to do the generic keyword research that we're all used to, with the same limitations as the tools above (i.e. Google doesn't hand out its search keyphrase volumes), but that's just the first step.
They then take a look at your competitors in both the organic and PPC results, figure out what those competitors are ranking for and bidding on, and provide some great reports on saturation levels, competition levels, and a lot more. With this in hand you can then begin to analyze how they're ranking (that'll be covered next week in our article on competition analysis).
The folks at AdGooroo also store historical information so you can look back over trends in the past and compare that to what you see now. As noted, a bit pricey for some but worth it for those who can afford to know this level of information on who's doing what and what you should be doing.
I should also note that my experience is with their SEM Insight product, which costs $399/mth. They also offer AdGooroo Express, which has a lot of the same features (but is missing a lot of the ones I personally feel can give a researcher a HUGE jump on their competitors). The Express version, however, sells at $89/mth, making it far more affordable for some. And like all my favorite tools, they provide a free trial. :)
But if you can't afford that level of information, you'll want to run ranking reports on all your top competitors (you likely know who these are, but if you don't – they're the ones who rank in the top 10 for the most competitive phrases). You can either do this manually or use a reporting tool such as WebPosition Gold (http://www.beanstalk-inc.com/resources/recommended/webposition.htm – again, it has a free trial).
If you find weaker sites ranking for large numbers of phrases, you know who to watch (again, we'll get into this more next week). The only problem with this method is that you can only check the phrases you can think of. The site might be ranking for phrases you never thought to look into which, if known, might provide some great insight into additional targets and tactics. Of course, you might well be in an industry with very obvious and well-defined keywords.
Understanding Your Choices
So now you've got choices to make. You've got a list of perhaps hundreds of keywords and you need to shorten that list. The number of phrases you target will only be limited by your site and the amount of time you have to dedicate to it.
You will likely need to pare down your choices to those that will produce the fastest and highest ROI possible. This will likely be the phrases that provide the lowest competition levels for the highest searched “buy phrases”. Once you have attained these rankings you can move on.
The alternative is to go for the gold and target the biggest phrases in your industry. This will take longer (99% of the time) but might be necessary if there are no suitable secondary phrases. In this event you have to ready yourself for a slow rise to the top and a longer period of stagnant traffic with a big return (hopefully) at the end.
Another major choice you'll have to make (especially if you have a large number of potential phrases) is whether to start out with a PPC campaign for the traffic or to test keyword phrases for an organic promotion. While these will be covered in more detail in part 7, if you just can't wait you can find a past article on the subject titled “Using PPC To Maximize Your Search Engine Positioning ROI” (http://www.beanstalk-inc.com/articles/ppc/4roi.htm).
More Info On This Series
As noted but worth mentioning again, this article series is being supplemented with a weekly show on WebmasterRadio.fm. Be sure to tune in or download the podcast to get the full information and hear some great interviews with the tool makers and experts.
Next week the topic will be competition analysis and will be written by StepForth, Inc. author and owner Ross Dunn. Ross will of course be on the show with us next Thursday along with some great guests.
Thu 26 Feb 2009, 15:47 PM | Posted by admin
Tags: Web Design & Development
Writing insecure code is easy. Everybody does it. Sometimes we do it accidentally because we don’t realize that the security issue exists, and sometimes we do it on purpose because we suspect the bad guys won’t notice one little vulnerability. Secure programming is often overlooked because of ignorance, time constraints, or any number of other factors. Since security isn’t flashy until something goes wrong, it is often easy to put it off.
Once your application is compromised, you will realize there’s nothing more important. The best case scenario is that you lose days of productivity and suffer downtime while you fix what was damaged. The worst case scenario: your data is compromised and you have no idea if it is correct, much less what the hackers managed to copy and read. Did you expose usernames and passwords to the world? Did you happen to release the credit card information of thousands into the den of identity thieves? You’ll never really know. It’s best to practice secure programming so you never need to ask yourself these questions.
With this in mind, let’s examine three different classes of secure programming no-nos – storage risks, system risks, and exposure risks – and discuss how we can prevent each of them. Server configuration and data transmission security are beyond the scope of this article, but the reader should be aware that they also play a major role in securing a web application.
Storage risks are those involved in storing data and interacting with a database server or file system. The most widely known of these is the infamous SQL injection attack. SQL injection occurs when you allow the user to input data into a query and, instead of a value, the user adds his own SQL to the query. The easiest way to prevent this type of attack is to escape every user variable that could touch your queries. Luckily, PHP has several built-in functions for handling this, such as mysql_real_escape_string(). Essentially, this works by escaping characters in a string that could conceivably be used to terminate your query and run a user-specified query.
When should you escape user data? It all depends on who you talk to. Some programmers prefer to escape as soon as it enters the application, while others prefer to wait until just before it is placed into the query. Personally, I prefer to escape right before it is inserted into the query. I do this because I can always look at the code, see the database interaction, and see that the data was escaped before it was used. I don’t need to search the entire source to make sure something was escaped.
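Escaping works, but prepared statements avoid the question entirely by keeping user input out of the SQL string altogether. Here is a minimal, self-contained sketch using PDO with an in-memory SQLite database (assumed available; with MySQL only the DSN would change):

```php
<?php
// Parameterized queries stop injection without manual escaping.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');

// Hostile input that would break a naively concatenated query
$name = "x'; DROP TABLE users; --";

// The placeholder keeps the input as data, never as SQL
$stmt = $pdo->prepare('INSERT INTO users (name) VALUES (:name)');
$stmt->execute([':name' => $name]);

// The table survives, and the hostile string is stored verbatim
$found = $pdo->query('SELECT name FROM users')->fetchColumn();
```

The same pattern covers the escape-early vs. escape-late debate: with placeholders there is nothing to escape, so the timing question disappears.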
The second storage risk we’ll talk about is storing passwords as plain text (hereafter referred to as clear text). I know you guys do it; I’ve seen too many open source applications and too many in-house applications to believe that it doesn’t go on. Simply put, there is never any reason to store a password in clear text. It doesn’t matter whether you’re storing the password in a database or a flat file: always store passwords as a hash. You can accomplish this simply enough by using PHP’s md5() function to transform the password before you insert it into your storage medium. Since md5 is repeatable, you can validate a password by simply hashing the value the user submits and comparing the result to the stored hash.
When should you transform the password into a hash? As soon as possible. Don’t let the password variable float around your application at all. As soon as you grab the password input, convert it into a hash. I prefer to do this by setting the password variable to its own hash; this avoids the chance of using the wrong variable in later code.
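The hash-immediately advice can be sketched as follows. This follows the article's md5() approach; note that unsalted md5 is considered too weak for passwords by modern standards, so treat this as an illustration of the pattern rather than an endorsement of the algorithm:

```php
<?php
// Sketch: overwrite the clear-text password with its hash as soon as it
// enters the application, then only ever compare hashes.
function hashPassword($password) {
    return md5($password);
}

function checkPassword($candidate, $storedHash) {
    return md5($candidate) === $storedHash;
}

// Simulated request input (hypothetical value standing in for $_POST data)
$password = 'hunter2';
$password = hashPassword($password); // the clear text is gone from this point on
```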
Next, let’s talk about the usernames and passwords your program needs in order to interact with other applications (like database servers). You should always separate these into a different PHP file from the rest of your code and reference them as constants or variables. This not only makes your code easier to maintain (if you need to change a password, you know exactly where to look); in the event that your source gets released, you know that the password isn’t in that file. While it’s certainly true that an attacker could grab your password file too, it does reduce the risk considerably.
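A sketch of the idea, assuming a hypothetical config.php that holds nothing but credentials:

```php
<?php
// config.php (hypothetical filename): credentials live here and nowhere else.
define('DB_HOST', 'localhost');
define('DB_USER', 'app_user');   // placeholder value
define('DB_PASS', 'change-me');  // placeholder value

// Application code elsewhere would then do something like:
//   require_once 'config.php';
//   $link = mysqli_connect(DB_HOST, DB_USER, DB_PASS);
```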
Before we leave usernames behind, I want to touch on the concept of division of power. We’re not talking about the government in this case, but about database users. The database user accounts your program uses should have the minimum level of access they need in order to function correctly.
If your application only reads from a database, then the database account it uses should only have SELECT permission on that particular database, and no access to any other database.
To take this concept a step further, I prefer to create multiple database accounts for my web applications. Typically I create one account that only has INSERT permissions for the particular tables the software needs to write to, and a completely separate account that only has SELECT access. This makes sure that no INSERT queries are accidentally performed and mitigates the possible damage done by SQL injections.
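The split-account setup described above might look like this in MySQL (hypothetical account names and database; MySQL 5.x-era syntax):

```sql
-- Read-only account: can SELECT from the application database, nothing else
CREATE USER 'app_reader'@'localhost' IDENTIFIED BY 'reader-secret';
GRANT SELECT ON appdb.* TO 'app_reader'@'localhost';

-- Write-only account: can INSERT into the one table the software writes to
CREATE USER 'app_writer'@'localhost' IDENTIFIED BY 'writer-secret';
GRANT INSERT ON appdb.articles TO 'app_writer'@'localhost';
```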
Of course, multiple accounts work best when there’s a clear separation between those who can write to a database and those who can read it (such as a CMS). In theory, you could use multiple accounts in any application but you run into problems with the number of open connections to the database. This is simply something that should be considered as a possibility during the design phase of your software.
I’m a big advocate, as are most programmers, of breaking source code into multiple files at every logical opportunity. However, I’ve noticed that a lot of PHP programmers have a nasty habit of naming files they intend to use as libraries or other includes with the extension .inc, .config, or some other non-.php extension. This is a horrible idea because the server it’s running on might not be set up to parse these extensions as PHP, so anyone loading such a file directly would see the source code (and potentially passwords, usernames, and other protected information). I prefer to prefix filenames instead, using inc_ or class_ when needed.
While we’re discussing included files, I would like to mention two other security precautions. If you have a PHP file that you intend to use only as part of a larger PHP application, add a guard to the beginning of the file that compares __FILE__ against $_SERVER['PHP_SELF'] and exits if they match.
This will cause the file to terminate immediately if someone tries to run it directly. A well-written include or class file shouldn’t do anything when loaded on its own, but you can never be too careful – especially when a one-line cut and paste can potentially save you so much heartache.
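A sketch of that guard, wrapped in a function here so the comparison can be exercised directly (the function name is my own; the snippet in the comment is what would actually sit at the top of an include file):

```php
<?php
// True when the file being executed is the one the web server was asked for,
// i.e. someone requested the include file directly.
function isDirectRequest($file, $phpSelf) {
    return basename($file) === basename($phpSelf);
}

// Top of an include file (e.g. a hypothetical inc_helpers.php):
//   if (isDirectRequest(__FILE__, $_SERVER['PHP_SELF'])) {
//       exit('Direct access forbidden.');
//   }
```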
The other include-related item I’d like to discuss is the difference between include() and readfile(). include() tells the server to parse the file as PHP, while readfile() tells the server to output the file as straight text. You should never use include() on a file that is publicly writable (for example, if you have an application that appends user-submitted data to the end of a file in order to simulate a graffiti wall or guest book) or on a file that you don’t control (files on other servers, or files that others can edit). A malicious user could easily inject his own PHP into your system, causing untold amounts of havoc. At the same time, you should never call readfile() on a file that ends in .php: on a misconfigured system, this runs the risk of exposing your source code to the world. To summarize: use readfile() on HTML, text, and remote files; use include() on local files containing PHP code you want to execute.
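A small demonstration of the readfile() side of that rule: the user-writable file's contents are emitted verbatim, PHP tags and all, rather than executed (the temp-file setup is just scaffolding for the example):

```php
<?php
// Simulate a user-writable "graffiti wall" file containing injected PHP.
$wall = tempnam(sys_get_temp_dir(), 'wall_');
file_put_contents($wall, "<?php echo 'injected'; ?>\n");

ob_start();
readfile($wall);            // raw output; the PHP tag is NOT parsed or run
$output = ob_get_clean();
unlink($wall);
// Had this been include($wall), the injected code would have executed.
```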
Now let’s talk about system risks. I think of system risks as those related to the way code executes. The primary system risk in any application is invalid data. You can never validate data enough. As soon as user data enters the system, you should immediately verify that it exists and that it is what you want it to be; if not, your program should halt and prompt the user for better input.
When validating data, you should use the tightest filter possible. For example, if your program is expecting a percentage, it should not simply verify that the user entered something; it should verify that the value is numeric and between 0 and 100.
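The percentage example can be sketched as a validation function (returning null on failure is my own convention here; throwing an exception works equally well):

```php
<?php
// Tightest filter possible for a percentage: numeric, and within 0..100.
function validatePercentage($input) {
    if (!is_numeric($input)) {
        return null;           // reject non-numeric input outright
    }
    $value = (float) $input;
    if ($value < 0 || $value > 100) {
        return null;           // numeric, but out of range
    }
    return $value;
}
```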
You should also validate at every level. Every time a function accepts input, verify that the data is what you expected it to be and react accordingly if it is bad. Not only does this make it more likely that you will catch bad data that slipped through due to a programming oversight, it also has the added advantage of catching logic errors in your software.
Next, I’d like to talk about eval(), exec(), and their ilk (shell_exec(), system(), passthru(), and pcntl_exec()). Visit their respective pages in the PHP manual to find out more about them, but in practice there is very rarely any reason to use them. eval() will run any PHP code passed to it as a variable. This is inherently dangerous because you no longer have absolute control over what code is executed. If you must use eval(), never run it with a variable derived from a user-determined value; otherwise you run the risk of a hacker injecting his own code. exec() and the like pose similar threats: allowing your script to interact with the command line is a level of power you should rarely, if ever, need.
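When user input must select behavior, a whitelist lookup gives you the flexibility people usually reach for eval() to get, without executing arbitrary code. A sketch (function name and operation set are illustrative):

```php
<?php
// Instead of eval()'ing a user-supplied operation name, map known names
// to known callables and refuse everything else.
function runOperation($name, $a, $b) {
    $allowed = [
        'min' => 'min',
        'max' => 'max',
    ];
    if (!isset($allowed[$name])) {
        return null;                 // unknown operation: refuse, don't eval
    }
    return call_user_func($allowed[$name], $a, $b);
}
```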
Finally, let’s talk about a couple of exposure risks. Usually, you don’t want to show your error messages to the world. For one, they freak people out. Secondly, they give hackers a wealth of information about potential bugs in your code. On production systems, always turn error display off and use PHP’s error_log() function instead.
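A minimal sketch of those production settings (they can equally live in php.ini):

```php
<?php
// Production error handling: record everything, show nothing to visitors.
ini_set('display_errors', '0'); // never render errors into the page
error_reporting(E_ALL);         // ...but still report every class of error
error_log('example: this goes to the server log, not the browser');
```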
The last risk we’ll talk about is exposing session IDs. Simply put, try never to send the session ID to the user in a visible way. Sessions aren’t secure to begin with, but if you transmit the session ID you run an even greater risk of someone other than the expected user acting as a "man in the middle" (to steal an analogy) and piggy-backing on the legitimate user’s session. An example of this would be using a session ID to hijack someone’s shopping cart and change a delivery address, get credit card information, or do something even more malicious depending on the system.
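One concrete way to keep the session ID out of URLs is PHP's session configuration; both directives shown here predate this article (a sketch, not a complete session-hardening recipe):

```php
<?php
// Keep session ids in cookies only, never in URLs.
ini_set('session.use_only_cookies', '1'); // ignore ids passed via the URL
ini_set('session.use_trans_sid', '0');    // never rewrite links to embed the id
```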
We’ve discussed many security risks involved with programming in PHP, but they boil down to a few simple concepts.
* Never trust the user – don’t let them run code on your server and always validate any data they send you.
* Don’t give the user, or your software, any level of access greater than the absolute minimum needed to successfully accomplish their tasks.
* Don’t tell the user more than they need to know – don’t let them see your code, the session ID, or any error messages that you didn’t create specifically for them.
Thu 26 Feb 2009, 15:43 PM | Posted by admin
Tags: Web Design & Development
Welcome to part two in this ten-part SEO series. The ten parts of the SEO process we will be covering are:
Keyword Research & Selection
Keeping It Up
What is a Competitor Analysis?
Have you ever wondered how a particular competitor always does so much better than you do in the search engines or online overall? A competitor analysis is one very effective method of deconstructing their online marketing strategy to discover how they are doing so well.
What Exactly Can a Competitor Analysis Reveal?
This is a very common question because many site owners don't know the lengths to which a competitor may have gone to obtain top rankings. The following examples are some of the discoveries I have made in a typical competitor analysis:
By examining a competitor's link structure I have found that many of the links with the most credibility came from websites the competitor actually owned. (Determining the ownership of the domain names required some sleuthing because the whois information was 'private' but ultimately the info became available.) In a couple of cases several of these domains had legitimate websites and this prompted some great ideas for my client to attain more traffic.
While researching a competitor I noticed that although the competitor's website was very similar to my client's, there was one major difference: the competitor's website structure was far better optimized. By outlining the structure the competitor used and improving on it with my own expertise, our client had the information he needed to apply changes to his own site.
In another instance I provided a client the list of all the pay per click keywords and organic keywords that each competitor was currently using. The client was flabbergasted when she realized just how many keywords she had missed promoting for her own comparable services.
The Basics of Conducting Your Own Competitor Analysis
Now that you have seen some examples of what can be gleaned from a competitor analysis, you might want to conduct one of your own. For the purpose of this tutorial I am assuming that you are fairly new to SEO, so I created a basic plan that works for most users; even this, though, will require a little preparatory reading. The following is a list of essential reading material:
Is Your Website Search Engine Friendly? Your Personal Checklist
This article will show you what elements of a website make it search engine friendly and this will help you see which optimizations your competitor used.
The 10 Minute Search Engine Optimization (Updated in 2007 from 2004)
A refresher on the proper elements of optimization.
15 Shades of Spam
Jim Hedger wrote this back in 2006 to increase awareness of the types of Spam that can get webmasters into hot water. It is an excellent refresher to help you identify them.
Many more free SEO tutorials are available if you find yourself needing more information. The following is an outline of the most revealing steps requiring the least technical expertise. Please keep in mind that the objective of this competitor analysis is to compare what you find to your own website later on. What you find may not seem earth-shattering (or it might), but this analysis is meant to show you what you might be missing:
Grab a piece of paper and a pen, and as you walk through your competitor's website look for any particularly obvious search engine optimization techniques. Here are some elements you should check:
Does the title tag appear well written, and if so, is there a common syntax used throughout the website?
Look at the source code of the home page and search for "H1", "H2" or "H3". Do any of these tags show up? If so, the competitor is using heading tags within the page. Now try identifying the text they used in the heading; you will likely find the competitor's key phrase within the tag.
Check if the navigation is search engine friendly. Sometimes the navigation is a drop-down menu; make sure it is a type that is search engine friendly. If not, check the footer of the page and see if a text menu is placed there.
Keep an eye out for a pattern of keywords being used in text links. Certain words are likely to appear more often and these are likely some of the target phrases your competitor has decided to focus on.
Look for nofollow tags. Nofollow tags are often used to channel PageRank efficiently throughout a website. This is called a themed structure and it can have incredible ranking benefits. If you see a pattern of nofollow use, you can be relatively certain your competitor has (or had) a well-informed SEO firm on hire.
While you roam through the site look for pages that have particularly high Google PageRank and try to identify why. In most cases these pages have information that visitors decided to link to. Perhaps this will give you some ideas for creating similar quality content for your website.
Check the site for the presence of an XML sitemap. Usually it will reside at the root of the website, so try typing in the basic URL of the competitor's website and adding (minus the quotes) "/sitemap.xml" on the end. The details within the sitemap might be a little confusing to you, but just knowing that the competitor has one is noteworthy.
Have you found any incidences of spam throughout the site? Take note: I have lost count of how many competitors have succeeded using shady tactics. This doesn't mean you should copy them, but it may at least give you yet another indication of what helped the competitor attain rankings. Believe me, in most cases these sites will eventually get caught with their hands in the cookie jar, at which point you won't want to be associated with the same tactics.
I can't possibly list everything you need to keep an eye out for when walking through a competitor's website; at least not in an article format. Just keep an eye out for anything that looks particularly purposeful in the site and linking structure as well as the content of the website. If you find something you can't be sure is worth noting, then try researching it online; chances are someone has written about the topic/concept or can provide you advice in a forum.
This portion of the analysis will require that you use one of the following link analysis tools: OptiLink (not free but my first choice) or Backlink Analyzer from SEO Book (free). In each case these tools have excellent help files that I suggest reading in order to get the best results from the data they generate.
In this particular stage you are going to use your new tool to analyze the first 1000 backlinks of your competitor's domain.
Program Setup Note: Be certain to set up the software to acquire Google PageRank and Alexa Rank information for each backlink and to filter out any rel=nofollow links. The setting is easily found on the front of both applications, with the exception of the rel=nofollow filter, which is an option in OptiLink but automatically checked in Backlink Analyzer.
When the report is complete, sort the backlinks by PageRank and then by Alexa Rank, and examine each sorting separately.
Why Are Both PageRank and Alexa Rank Used?
The reason both are used is that each has notable advantages and disadvantages. PageRank is notoriously unreliable, especially lately, since Google now penalizes the PageRank of any site with any relation to link buying; as a result, quality sites with low PR could be overlooked. Alexa Rank, meanwhile, is a decent indicator of a site's popularity, but I can't rely on it alone since it is not an established indicator of how well a site is regarded in Google. Between the two stats, however, we can glean a good indication of which sites are the most valuable link sources.
Creating a List of Authority Competitor Backlinks
Using Excel or another spreadsheet application, copy and paste the data you received from OptiLink or Backlink Analyzer into a worksheet. Then duplicate the sheet so that you have an exact copy of all the data on a second sheet. Now follow these steps:
On the first worksheet, sort the data by Google PageRank (PR) from highest to lowest. Now remove all of the pages with a PageRank lower than 4 so you are left with the best sites according to this data, or simply separate the lower-ranked sites so they don't get in the way.
On the second worksheet, sort the data first by Alexa Rank (lowest to highest) and then do a secondary sort by Google PageRank (highest to lowest). Delete all sites that have no Alexa Rank ("nm" is how it shows in OptiLink) or otherwise partition them from your more valuable data.
Now you have two excellent worksheets that provide lists of authority pages that have links pointing to your competitor.
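For anyone who prefers script to spreadsheet, the two worksheet steps above can be sketched in PHP (the row layout is a hypothetical stand-in for the exported data; "nm" rows are represented as null):

```php
<?php
// Each backlink row: 'url', 'pr' (Google PageRank), 'alexa' (null when "nm").
$backlinks = [
    ['url' => 'http://a.example/page', 'pr' => 6, 'alexa' => 1200],
    ['url' => 'http://b.example/page', 'pr' => 2, 'alexa' => 90000],
    ['url' => 'http://c.example/page', 'pr' => 4, 'alexa' => null],
];

// Worksheet 1: keep PR >= 4, sorted highest to lowest.
$byPr = array_values(array_filter($backlinks, function ($row) {
    return $row['pr'] >= 4;
}));
usort($byPr, function ($a, $b) { return $b['pr'] - $a['pr']; });

// Worksheet 2: drop "nm" rows, sort Alexa ascending then PR descending.
$byAlexa = array_values(array_filter($backlinks, function ($row) {
    return $row['alexa'] !== null;
}));
usort($byAlexa, function ($a, $b) {
    if ($a['alexa'] !== $b['alexa']) { return $a['alexa'] - $b['alexa']; }
    return $b['pr'] - $a['pr'];
});
```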
How to Use the Backlink Data
Take some time now to filter the links by domain and you will see just how many links per domain each competitor has. If you see a website that links to a competitor a lot, it is usually because the competitor either owns the website or has purchased a link on it. To find out if your competitor owns the website, try running a whois lookup on the domain.
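The per-domain tally can be sketched in a few lines of PHP (the URLs are placeholders for your exported backlink list):

```php
<?php
// Count backlinks per domain to spot sites the competitor may own or
// have bought links on.
$linkUrls = [
    'http://blog.example.com/post-1',
    'http://blog.example.com/post-2',
    'http://news.example.org/story',
];

$perDomain = [];
foreach ($linkUrls as $url) {
    $host = parse_url($url, PHP_URL_HOST);
    if ($host === null || $host === false) {
        continue; // skip malformed URLs
    }
    $perDomain[$host] = isset($perDomain[$host]) ? $perDomain[$host] + 1 : 1;
}
arsort($perDomain); // domains with the most links first
```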
Also check the link data for how many of the pages listed are from the competitor's own website. If you see a great deal from their own site, you can be relatively sure they have good content, which is important to note; perhaps you need to focus on creating better content on your own website, or on getting others to notice the good content you have.
Now the most logical step is to figure out which links are worth getting for yourself. Chances are a decent number of the links you found are from pages that would be willing to link to you as well.
Don't Lose Focus on Your Own Website
So now you have a few tools to conduct a cursory competitor analysis. You will likely find some very useful data that you can act on but is this all you need to do? Is a competitor analysis going to be the golden key to increased profits? No. I have a great deal of faith in competitor analysis because I know determining what a competitor is doing successfully can improve a marketing plan dramatically. That said, you also have to pay close attention to your own website and the quality information that can be gained from using free tools like Google Analytics or handy paid tools like ClickTracks Professional.
Using a quality analytics program will allow you to get as granular as monitoring the success of each page in your website with details such as: where did visitors come from (somewhere in your site or from another?), how long on average visitors stayed at a particular page, what keywords led visitors to the page (if any), and much more.
With proper analytics you can actually compare and contrast the effects of minor edits to a page's content; this is called multivariate testing. For example you can run tests to see if you can improve the retention of visitors by adding a better image or a better tag line because you noticed that many visitors were entering at a page deep within your site that was not originally designed as an entry page.
Truly, the sky is the limit with analytics and it would be irresponsible for me to state that competitor analysis is more important than making your own website run smoothly. Do yourself a favour, if you haven't already got an analytics program running on your site, get it done now or learn how to use the one you have; it will pay off in the long run. Especially when you want to monitor the success of the tactics you applied to your site from your competitor analysis findings.
Tags: Web Design & Development
The key to a successful Website is content. Content alone will drive visitors to your site, and compel your prospects or customers to form a valuable opinion about your company. With over 7 million Websites being added every month, how will your site stand out? The way to get your Web site noticed by search engines and then to get those prospects to stay and keep coming back comes from the content on your Website alone. There are no two ways around this fact.
Thu 26 Feb 2009, 15:40 PM | Posted by admin
Enterprise Website owners have entire teams in place that can manually update and refresh content on a regular basis; small Website owners simply don't. In a world where information changes rapidly, Websites need to be constantly updated and monitored. This is extremely crucial as users - and even search engines such as Google - pass over stale Websites. Manual configuration and updating of Websites takes time and can be messy too, involving integration with multiple systems like campaign management or CRM. With a skimpy IT staff and shrinking budgets, most small Website owners find it challenging to ensure that their sites reflect their dynamic business needs.
Web Content Management Systems (CMS) – a must-have for small Websites
Content Management Systems (CMS) have emerged to address these issues. In simple terms, a CMS is a tool that eases the creation, maintenance and management of the content of a Website. The biggest advantage is that it allows non-technical Subject Matter Experts (SMEs) to manage different aspects of a Website, such as the general content as well as the navigation, page layout and links, without any knowledge of programming tools.
Web content management is as necessary for small companies as it is for large organizations. Small Websites may not have mass popularity, but they have a simple advantage over large Websites: they are more focused and targeted. But still, that committed user community, though small in size, presents significant challenges for the small Website owner. Adding news items, making design changes or launching new products via separate Web pages is not a simple affair.
A CMS gives small Website owners the power to easily modify and change text, design and layout of their Website through well defined templates. It also becomes simple to quickly insert, delete or update anything from images galleries, forms, tables, styles or formatted lists.
Traditionally, most small Website owners have relied on independent software consultants or vendors to keep their sites updated. However, as our experience shows, conveying multiple small text or design changes through a third party is frustrating and wasteful. A CMS can ease these difficulties, providing users with a simple and non-technical way of managing the content and not the technology.
Key benefits include :
• Improved speed and ease of publishing content
Small Website owners can accelerate content publishing by giving SMEs tools that are simple to use. Content specialists can focus on creating content rather than worrying about publishing, approval processes and formatting changes, so the site keeps the same look and feel.
• Maintaining consistency and link integrity
A content management system helps organizations maintain consistency across all pages of their Website (with style sheets, templates, etc.) so that branding and design are controlled to the level desired, regardless of who is responsible for the actual content. As a result, visitors have a consistent and professional experience. The CMS even helps to maintain link integrity, significantly reducing the chance of users reporting a missing link. This is critical, as a missing link in the Internet world means a missed business opportunity.
• Full control
Features like complete system auditing and reporting provide organizations the ability to manage and track history of all work, facilitating regulatory compliance. Files can be given a full document lifecycle, including check-in, check-out, versioning, rollback, approvals, and scheduling. A CMS has intelligent workflow automation, ensuring that content passes through appropriate quality gates before being published. Additionally, completely configurable workflows enable organizations to assign tasks to any person, with provisions to escalate in case defined thresholds are crossed. For example, e-mail alerts can be sent to content owners of specific sections on a Website, if these sections are not updated after a specific time period. This is difficult to do in a manual system.
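The stale-content alert described above can be illustrated with a minimal sketch. This is not any particular CMS's feature; the section names, dates and the shape of the update log are invented for the example.

```python
from datetime import datetime, timedelta

def stale_sections(last_updated, max_age_days=30, now=None):
    """Return section names whose content has not changed within max_age_days."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)
    return [name for name, updated in last_updated.items() if updated < cutoff]

# Hypothetical section-update log a CMS might keep
log = {
    "news": datetime(2009, 2, 20),
    "products": datetime(2008, 11, 3),
}
# Sections in this list would trigger an alert e-mail to their owners
overdue = stale_sections(log, max_age_days=30, now=datetime(2009, 2, 26))
```

A real CMS would wire the result of such a check into its workflow engine and e-mail the relevant content owners.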
In a digital world where content can creatively be used in a variety of forms such as whitepapers, podcasts or articles, effectively managing and using this content is critical for competitive advantage. A CMS can help in a huge way, by centralizing and streamlining the process of managing and publishing content. This centralized control medium also means that an organization, no matter what size, can effectively measure the success of various online marketing initiatives.
The Software-as-a-Service (SaaS) angle
While a CMS helps business users manage content more effectively, organizations must also understand that content management systems can be expensive to procure, complex to implement and configure, and even more difficult to maintain.
To address these issues, organizations can consider using a CMS delivered as Software-as-a-Service (SaaS). With this software model, organizations are spared the high initial cost of purchasing a license. Also, since the software is hosted, there is no hardware to buy, no software to install and no infrastructure to manage. As a customer, an organization just pays on a fixed monthly or quarterly basis and leaves the task of managing, maintaining and upgrading the software to the vendor.
Organizations also save costs, as they do not have to budget for a developer who tweaks HTML code, or a Webmaster who actually takes care of hosting. Players such as CrownPeak even provide a dedicated account manager, as part of their Lifelong Active Support. By using a SaaS model, organizations can also minimize risk and choose different functionalities as they grow. Further, as billing is on a monthly or quarterly basis, costs are spread across the lifetime of a product’s usage. This is an extremely attractive value proposition when compared to the traditional software model, where costs are paid upfront and the risk of product implementation and adoption is totally on the customer.
If you have an online presence, you need a CMS
The Internet is a great equalizer – and a user will not forgive an organization because of its size, if it has missing, incomplete or old content on its Website. In a scenario where millions of Websites jostle for attention simultaneously, a small Website owner needs the guiding hand of a CMS solution to increase its value for a niche audience. The difference between being seen as a pearl or as flotsam on the internet is too great – it is up to the small Website owner to make the right choices.
Tags: Web Design & Development
This article takes a look at the top 10 web design tips for users at any level. It will give you 10 quite different points to contemplate when you next begin a site design or when talking to your web design consultant or employee.
Thu 26 Feb 2009, 15:39 PM | Posted by admin
Design is never straightforward, and web design has the additional unpredictable complication of technology thrown in. This means that you need to consider the consequences of your design decisions and how they will affect the most important people who see your site: the users themselves. The following tips should help you consider this and have a positive effect on your site and its users.
1. Navigation & Functionality
You should never sacrifice overall functionality for artistic extravagance. It is highly unlikely your site will ever achieve its purpose if the people who visit it cannot clearly and easily navigate around it.
Your site should look good but first and foremost consider how someone who knows nothing about the site would think when they landed there.
Something occurring frequently in websites these days is Mystery Meat Navigation. This term, coined by Vincent Flanders, describes sites where navigation structures are so obscure and difficult to process that users cannot identify them at all and end up running their mouse across whole sections of the screen just to identify hyperlinks.
2. Images & ALT Tags
People say images are worth a thousand words, and in web design that's true in two ways. Firstly, an image can do a lot more than text in some situations, but secondly, images are much, much bigger files with a longer download time.
It is widely accepted that users will click away from a page that takes longer than 5-10 seconds to load, and every image you put in a page increases the likelihood of this happening. Additionally, each image you embed into a page design triggers an additional HTTP request to your server, so dividing an image into smaller ones or using lots of small images across a page does not solve the problem.
ALT tags should also be factored into the code of a website. They are a huge help to people who have images turned off in their browser, use mobile browsers that can't display images, or hit a random error preventing an image from showing. They also hold a small SEO benefit.
3. Tables vs CSS
It is advised that you use CSS and not tables to format a document, but in some cases tables can be necessary. Remember one thing, however: a table cannot be displayed until it has fully loaded. This can cause a real problem for users as they wait for the page to load; nothing appears, then out of nowhere the whole page is done.
Someone is much more likely to click away when nothing is loading than when they can see progress.
4. Fonts
Don't design sites to use fonts only you have; chances are they will be converted into some dull default font and ruin the effect you were trying to achieve. Save special fonts for specific headers and convert them to images. Build the rest of your site in standard fonts so that as many browsers as possible will see it the way you meant it to be seen. Recommended fonts for wide compatibility are Arial, Verdana, Courier, Tahoma and Helvetica.
5. Plug-ins
Plug-ins hold a lot of potential for both users and designers, but they can easily be misused.
Plug-ins come in many forms and uses, the most popular being Java and Flash Player. I have heard a lot of people say that these plug-ins are "safe" and that everyone has them, but this is simply not true. Every plug-in has a stack of different versions and behaves differently depending on the browser the user is surfing with.
Ask yourself whether your users will really want to browse to another site to download a plug-in, restart the browser, then navigate back. If the answer is no, use the other tricks at your disposal to make your page unique and save the big guns that are Flash and Java for times when they are essential.
6. Tags
The "tags" I am referring to here are the meta keywords and description, title, alt and h1 tags. Together these tags help manage your site's search engine optimisation (SEO) potential, and this is definitely something not to overlook. Helping people find your site will bring more traffic in, and more conversions if you are a retail site.
The higher search engines rank you, the more traffic will filter down and the more successful your site will be. Try to keep a constant theme running through all your tags, but do so in a subtle way. Splashing the same word 1000 times on your page will only have negative effects, so make sure you strike the correct balance between informative and spammy.
7. Browser Compatibility
In a perfect world everyone would use the same browser and your website would look the same on everyone's screen, but unfortunately this is not the case. Every browser has its own specific functions and styles, and making your code cooperate with each of them can present some serious problems.
The three you really need to concern yourself with are Internet Explorer, Firefox and Safari. These make up a good 95% of the browser market at the very least, and while there are some additional popular browsers, I would not recommend you spend time optimising code for the rest.
The only thing you can do is your best: stay away from browser-specific functions, because you'll never make someone get a new browser.
8. Pop Ups
Something that is very important to remember is that the user should be in control of their browser and desktop. If you place unnecessary pop-ups and window-opening links everywhere, the user will feel they have lost control of the site, become annoyed and close the windows.
There are some exceptions to avoiding the target="_blank" attribute, but I would steer clear of it whenever possible.
9. Text Layout
Text is part of your design too, and positioning it correctly on the page is very important. Try to get all the copy you need as early in the design process as possible. This means you will be able to design around the copy instead of trying to cram it into smaller spaces because someone wrote twice as much copy as they said they would.
Use the right alignment for the right situation. Remember that the vast majority of people prefer left-aligned text, and while justified text looks aesthetically better, it can be very difficult to read in longer stretches.
10. Site Search
In this day and age, finding what you want on a website is paramount. After you have followed the first tip on navigation, you should also provide a search bar on your site so that a visitor who cannot immediately see what they are looking for can search. Many internet studies have shown the success of these small additions, and free site-search tools are available from Google and many other search engine operators.
Taking these 10 tips into account will help you design a more user-friendly and successful website. Sometimes it may seem like they are hindering your grand design, but failing to take notice may turn your site into a very pretty stop sign for browsers. Just because you can find your way around your Flash menu system that takes 6 minutes to load doesn't mean Mrs Smith who needs the product can.
Tags: Web Design & Development
After you've opened your website and purchased a unique domain, you want to be found by potential users. Since the web contains millions of websites, there is a constant struggle between them over the first places in search engine results for competitive words.
Thu 26 Feb 2009, 15:35 PM | Posted by admin
Getting to the top of the search results in Google or Yahoo is a long and tricky process involving loads of work both on your website and outside it. In this article I'll outline how search engines determine relevancy: a short preview of search engine optimization.
First of all, let's review the two most important algorithms Google uses to determine page relevancy: PageRank and Hilltop.
The basic concept of PageRank is: the more incoming links your page has, the more important it is. PageRank is given per page, not per website, so don't be surprised if your website shows a different ranking on different pages. PageRank is calculated as follows:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where T1, ..., Tn are the pages linking to page A, C(Ti) is the number of outbound links on page Ti, and d is a damping factor (usually set to 0.85). The 1-10 toolbar score is roughly the logarithm of this raw value.
Tags: Web Design & Development
Why care about the Page rank?
Thu 26 Feb 2009, 15:32 PM | Posted by admin
As mentioned in previous articles, PageRank is one of the familiar variables in determining appearance in search engine results. Although the importance of PageRank has declined, and we sometimes see pages of PR 3 or 4 reaching first place in Google, PageRank is one of the variables SEOs deal with most often, because it is the variable we know a lot about and can approach sensibly. The second good reason to learn about PR is using it to emphasize important pages of your website, like the Homepage (you know how annoying it is when a search engine presents some negligible page of your website higher than the Homepage).
What is PR?
PR ranges from 0 to 10 and indicates the number and quality of a page's incoming links.
Page Rank is calculated as follows:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where:
PR(T1) – PageRank of page T1, one of the pages T1, ..., Tn linking to page A
C(T1) – number of outbound links on page T1
d – damping factor, usually set to 0.85
Notice that the accepted 1-10 scale comes from taking the logarithm of this raw value. Therefore each PageRank group (PR1, ..., PR10) is not equal in size, and reaching a higher PR group gets harder with every step.
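As a sketch (a simplified model, not Google's actual implementation), the damped formula PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)) can be iterated to a fixed point in plain Python, using the commonly cited damping value d = 0.85; the page names below are hypothetical.

```python
def pagerank(links, d=0.85, iterations=50):
    """Iterate PR(A) = (1-d) + d * sum(PR(T)/C(T)) over pages T linking to A."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}
    for _ in range(iterations):
        new = {}
        for page in pages:
            inbound = (pr[t] / len(links[t]) for t in pages if page in links[t])
            new[page] = (1 - d) + d * sum(inbound)
        pr = new
    return pr

# Three pages: A and B link to each other, C links to A
links = {"A": {"B"}, "B": {"A"}, "C": {"A"}}
scores = pagerank(links)
# A receives links from both B and C, so it ends up with the highest raw score;
# C has no incoming links, so it settles at the minimum value 1-d = 0.15
```

Taking the logarithm of these raw scores is what compresses them onto the familiar toolbar scale.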
This article deals with PageRank maximization techniques:
PR0 – Page rank zero
This rank is usually given to:
pages with very few incoming links
websites caught using black-hat strategies to promote themselves.
Webpages with PR0 can harm your website if one of your pages links to a PR0 page. However, if a PR0 page links to your page, your page is neither hurt nor helped, because Google assumes you don't have absolute control over your incoming links. Even so, you may exchange links with a PR0 page if:
The PR0 page doesn't contain hidden text or other forbidden elements of black hat SEO
The page is new and therefore gets zero ranking.
Although PR depends on your incoming links, linking to other websites can cause PR leakage, so link out to other websites only when necessary.
Linking to pages inside your own website does not cost you PR, so you can do it freely.
Make sure every page in your website links to other important pages, since every link makes its contribution. Leaving dead-end pages wastes this contribution.
Use PR to emphasize important pages:
Creating links smartly to the most important pages of your website, like the Homepage, gives them a PR boost and helps emphasize their importance over less important pages on the website.
This is the most popular and beneficial link scheme for emphasizing your homepage:
All pages in the website link to Homepage (first priority page)
A page of priority x does not link to other pages of priority x, only to the Homepage and to pages of priority x+1 and x-1 (for example, categories are not linked to each other).
No jumping between pages in non-hierarchic order: for example, a page of priority 1 doesn't link to a page of priority 3, and pages of the same priority are not interlinked.
If we count the relative distribution of PR between all the pages, in this scheme, Homepage gets the highest rank.
If we interlink the categories, the Homepage rank drops and the category PR rises a little.
If we link all the pages on the website to each other, all of them get the same, but low, PR.
To get the highest Homepage PR we must link all the pages to Homepage only and Homepage must link to all the pages.
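Under the same simplified damped-formula model (plain power iteration, not Google's actual system), the two schemes can be compared directly; the page names are hypothetical.

```python
def pr(links, d=0.85, rounds=60):
    # Same damped formula as in the article: PR(A) = (1-d) + d * sum(PR(T)/C(T))
    scores = {p: 1.0 for p in links}
    for _ in range(rounds):
        scores = {p: (1 - d) + d * sum(scores[t] / len(links[t])
                                       for t in links if p in links[t])
                  for p in links}
    return scores

pages = ["home", "cat1", "cat2", "cat3"]
# Scheme 1: every page links only to the homepage, the homepage links to all
hub = {"home": {"cat1", "cat2", "cat3"},
       "cat1": {"home"}, "cat2": {"home"}, "cat3": {"home"}}
# Scheme 2: every page links to every other page
mesh = {p: {q for q in pages if q != p} for p in pages}

hub_pr = pr(hub)
mesh_pr = pr(mesh)
# In the hub scheme the homepage concentrates PR; in the mesh all pages are equal
```

Running both confirms the article's point: the hub-and-spoke homepage scores well above any page in the fully interlinked mesh.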
Exchanging/ Purchasing Links
An incoming link is more effective if it comes from a high-PR page.
A page with few outbound links is a better contributor than a page with many (assuming they are of the same PR).
Exchange links only with well-ranked websites, because exchanging links causes PR leakage. This leakage must be weaker than the link's contribution, otherwise the exchange is not beneficial.
Don't exchange links with PR0 pages, because in the best-case scenario it doesn't help you (unless you expect the page to become very popular within a short period).
Tags: Web Design & Development
An index is a website containing details of, and links to, millions of websites divided into categories and topics. Indexes can have multiple purposes.
Thu 26 Feb 2009, 15:30 PM | Posted by admin
Local Business Index – like the Yellow Pages. These indexes appeal to people searching for services in their town or area. It is a good idea to sign your company into a few such indexes if your business is locally oriented, like a flower shop or an electrician. Inclusion in these indexes, at least the most popular of them, is usually quite expensive but also useful, both from the point of view of customers seeking a local service and of search engines, since you can include your site URL and earn a strong incoming link.
Global Business Index – like Hoovers.com or Business.com. These indexes usually contain data about large companies with a turnover of a few million dollars a year. Not everyone can enter such an index, although some of them allow small companies to list in exchange for a reciprocal link or an inclusion fee. Inclusion in these indexes is important to large companies appealing to customers worldwide, and they also give a good incoming link in terms of SEO.
The third type is Internet indexes, whose main purpose is to provide incoming links to websites. These indexes are usually manually edited, which means it can take months before an editor approves your link. Some indexes, like Yahoo, require an inclusion fee; some are free, like DMOZ; and some require a fee or a reciprocal link. Website promoters largely use indexes to provide newly built websites with incoming links, although not all indexes are useful and important in terms of link strength and website promotion. Therefore I suggest you submit your website to as many free indexes as possible, but be careful before paying for a listing or exchanging links with nameless indexes.
In this article I will list some of the most important indexes to be included in.
General Guidelines for submitting your website to an index:
Make sure your website has no pages "Under construction" or "Not Found" (Error 404), since some indexes might disqualify your website because of that.
Write down a short profile of your website which includes:
Site name, full URL, 10-15 keywords and phrases, and a short description (200-500 characters). The site name should not include many keywords; that might cause rejection by the editor.
Open an Excel file dedicated to indexes and directories where you keep all the passwords and usernames, since you will probably need to come back to these websites a few times to follow up on your registration.
Place your website in the most suitable category, in order to improve your chances to be indexed and later to improve the impact of the indexing on your page listing in search engines.
Indexing takes a lot of time, because index editors are volunteers and the demand is huge, so be patient and do not submit your website more than once, since that makes you look like a spammer and may cause your website to be rejected.
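The profile guidelines above can be turned into a small checklist script. This is just a sketch; the field names (site_name, url, keywords, description) are my own invention, not any directory's submission format.

```python
def validate_profile(profile):
    """Check a directory-submission profile against the guidelines above."""
    problems = []
    if not profile.get("url", "").startswith("http"):
        problems.append("full URL required")
    desc = profile.get("description", "")
    if not 200 <= len(desc) <= 500:
        problems.append("description should be 200-500 characters")
    if len(profile.get("keywords", [])) > 15:
        problems.append("use at most 10-15 keywords")
    # A keyword-stuffed site name is a common reason for editor rejection
    name_words = profile.get("site_name", "").lower().split()
    stuffed = sum(1 for k in profile.get("keywords", []) if k.lower() in name_words)
    if stuffed > 2:
        problems.append("site name looks keyword-stuffed")
    return problems
```

An empty result means the profile passes these rough checks; anything returned is worth fixing before you submit.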
DMOZ Open Directory Project
This is the most important and best-regarded index on the web. It is edited by volunteers and has more than half a million categories and around five million websites.
The main importance of DMOZ is its influence on major search engines like Google and Yahoo, which derive information from DMOZ and ascribe great importance to links coming from there to your website.
To include your website in DMOZ, go to www.dmoz.org, find the category most appropriate to your website, press Add URL and follow the instructions. Take a few minutes to read DMOZ's terms of submission so as not to mess up your application.
DMOZ will reject your website if it is placed under the wrong category, is mainly under construction, has no value to the public or is poorly described. The description must be objective, otherwise it will most likely be excluded.
DMOZ will not notify you about acceptance or rejection. If you can't find your website, don't hurry to submit it again; try asking at the DMOZ forums what the status of your application is.
Entering the DMOZ index is the only way to enter the Google website directory.
Yahoo Directory
Yahoo is the most popular website in the world and in the US. Inclusion in its index might be worth the hefty fee of $299. Write a short description (10-25 words), make it objective and fluent, and include 2-3 keywords, but do not exaggerate.
Make sure your website is in good condition (no 404 pages or under-construction pages). Yahoo is not obliged to include your website even if you pay the fee, but they are supposed to review your listing in about 7 working days. In the Yahoo index you don't need to pick a category; one of the editors does it for you. That said, most websites are included. Business sites must pay the fee every year, while non-profit sites pay only once.
Tags: Web Design & Development
Reading a website is not like reading a book or a newspaper, so before you start spending money on campaigns, make sure your site is built in accordance with internet writing standards. Let's talk about a few simple rules you don't want to break when building a website:
Thu 26 Feb 2009, 15:21 PM | Posted by admin
Don't waste your visitors' time
Time is money. Most people decide whether they are interested in what a website has to offer in just a few seconds. Therefore make sure your landing page provides the most crucial and appealing information about your business.
The number one time waster is a heavy-loading page with lots of Flash animations and pictures. Make sure your web pages are not too heavy, otherwise you might lose your visitors before they have even entered your website.
Most visitors need to find what they are looking for fast, otherwise they will look for it elsewhere. As a webmaster you must make sure your data is well categorized and divided into relevant pages, categories, headlines and sub-titles. A visitor must be able to navigate your website easily and successfully.
Avoid too many "special effects"
By this I mean don't include too many Flash movies, and avoid using music if you don't have to. As we mentioned earlier, Flash has a long loading time, which damages the viewing experience; besides, too much Flash can be annoying, and search engines can't read it.
Music is a great thing, but somehow music and websites do not fit together too well. If you want to put on some background tune anyway, make sure a user can silence it quickly and easily. Or, conversely, let the user click "Play" and start the music when he decides.
Pop-ups and new pages
If you have links on your site, I strongly advise you not to use the pop-up option and not to open these links in a new window. Seemingly, pop-ups and new windows are a good idea if we don't want the visitor to navigate away from our website, but many years of experience show that users prefer to click the "Back" button. As for pop-ups, most browsers tend to block them, and you don't want to waste your viewers' time on enabling pop-ups.
Fonts
Use standard fonts which are supported by all browsers, like Arial and Times New Roman. You don't want your viewers to get gibberish and leave your website. Don't use very small or very large fonts: you don't want your pages to stretch for miles, and you don't want to scare away people with poor vision.
Tags: Web Design & Development
Color is a very important aspect of design and of a website in general. Picking the wrong colors may repel your customers on a subconscious level and make your website and business look amateurish and unreliable. Many things have been said about colors and what they represent. In this article I will explain some basic rules for choosing website colors.
Colors carry certain associations, and this is something to keep in mind. Red is associated with danger and is very intense, so you shouldn't make it the dominant color if you want to create a calm, reliable atmosphere. Pink is a very feminine color, so you had better use it on websites related to children and women. Blue is the most popular color when it comes to website design because it represents calmness and reliability. White is the most popular color for a background, although many websites use black or dark backgrounds.
The problem with black is its intensity and the fact that it sometimes represents darkness and evil. Combining a black background with very bright colors may result in a very unbalanced, really bad-looking website. Still, there are cases where a designer can create quite a good-looking website despite the demand for a black background. If you are considering making your website black, I suggest you take on a very good designer, because black emphasizes all the other objects on the website, especially if they are bright.
Combining colors in your website can be a hard task. There are a number of free websites which may help you match website colors, like http://www.colormatch.dk/ . I suggest you use no more than three or four colors in your website so it won't look amateurish. If you want to add a picture to the header, select the website colors from the picture.
Be reasonable in choosing colors. If you own a beauty shop, don't color your website gray or black. If you have a law firm, don't make a pink or orange website.
Be aware that some people are color-blind and some people have vision problems, so make sure there is enough contrast between the text and the background so everyone will be able to distinguish between them.
Caution, Copyright! Finding beautiful images is very easy today; all you have to do is enter Google Images or one of the large image banks. Unfortunately for website owners, most of the pictures are copyright protected, and using them can cost you thousands of dollars. You may think that since your website is small and relatively anonymous, no one will find out. You should not count on it: today, technology allows finding content on the net very easily, so I suggest you don't take risks; pay for your pictures or use the pictures from your Website Building Provider.