10 Ways To Increase Traffic By Changing Your Code

Search engine optimization, or SEO, is a very complicated science. While no one knows exactly how Google ranks websites, we do have a lot of proven techniques.

Good SEO depends on several factors of your website, and when you think of SEO, you probably think of content and inbound links. But did you know that there are several ways to improve your traffic and SEO by changing your code? Increasing traffic and rankings through code depends on three factors:

  • Semantics – It’s been debated, but Google seems to give higher rankings to validated sites, and clean, valid markup leaves less chance for major rendering or download errors that can turn visitors away.
  • Download times – Users won’t leave your site because it takes 20 minutes to load, and faster pages are easier for search engines to spider.
  • Keywords – Just like regular SEO, it’s important to get pertinent keywords into your code for good rankings.

Let’s take a look at how you can use all three of these to easily improve your site’s code, traffic and even your SEO rankings.

1) Validation and Semantics

Validating your website and using proper semantics are important in several ways, but let’s focus on the SEO angle. When you follow W3C standards to validate your site (I use XHTML 1.0 Strict) and use proper semantics – cutting down on classes and IDs, using selectors, and naming elements properly – you get several traffic benefits:

  • Shorter Code – Shorter code means faster download times.
  • Better ranking – You’ll already get a slight increase in rankings from a validated site, without even taking content into consideration.
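Validation starts with the doctype: the W3C validator checks your markup against whatever you declare. If you go the XHTML 1.0 Strict route mentioned above, the declaration at the top of each page looks like this:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
```

Run your pages through validator.w3.org against this doctype before worrying about anything else.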

2) Link Title & Alt Tags

Link title and alt attributes serve accessibility standards and can also boost your SEO through keywords. You should have a title attribute on every link describing where the link goes, while also working a few keywords into it. For example, a link to FreelanceFolder’s index page could have a title of “Home of FreelanceFolder freelancing web articles and blog,” or whatever keywords you’re targeting on your site. The same goes for alt attributes on images. You’re required to have them for validation (#1) anyway, so make sure you put some good keywords in them.
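In markup, that looks something like this (the keyword phrases and file names are just examples):

```html
<!-- title attribute on a link: describe the destination, work in keywords -->
<a href="/" title="Home of FreelanceFolder freelancing web articles and blog">Home</a>

<!-- alt attribute on an image: required for validation, a keyword opportunity -->
<img src="logo.png" alt="FreelanceFolder freelancing blog logo" />
```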

3) Site Title Tags

Website title tags produce the text you see in browser tabs, or at the top of your browser window. Google weighs these pretty heavily, so make sure each page has a unique title containing its most important keywords.
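The title lives in the document head; the wording below is only illustrative:

```html
<head>
  <!-- unique per page, most important keywords first -->
  <title>Freelance Web Design Tips | FreelanceFolder</title>
</head>
```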

4) Replace Flash with jQuery

Flash websites are pretty much ignored by Google. Unless you’re a big-name company that doesn’t need rankings to get traffic, this can be a death sentence. If you replace your Flash with jQuery, however, not only will your content now be read by search engines, but ALL the content inside your jQuery will count, whether the user can see all of it at one time or not. This is very useful for people who use jQuery tabs to switch between large blocks of content.
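A minimal sketch of that tab pattern (the IDs and class names are hypothetical): every block of content is present in the HTML a spider reads, even though visitors see only one panel at a time.

```html
<ul id="tabs">
  <li><a href="#panel-services">Services</a></li>
  <li><a href="#panel-portfolio">Portfolio</a></li>
</ul>
<div class="panel" id="panel-services">Full services text here…</div>
<div class="panel" id="panel-portfolio">Full portfolio text here…</div>

<script type="text/javascript" src="jquery.min.js"></script>
<script type="text/javascript">
  // Hide every panel but the first, then swap panels on click.
  $('.panel').not(':first').hide();
  $('#tabs a').click(function () {
    $('.panel').hide();
    $($(this).attr('href')).show();
    return false; // stop the browser jumping to the # anchor
  });
</script>
```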

5) Optimize JS & CSS Files

Optimizing your files can greatly reduce your download times and server load. There are several free optimizers on the web, like JSO. Your JS should also be broken into multiple files, so each page isn’t loading a bunch of script it isn’t using.
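One way to apply this (the file names are hypothetical): minify everything, then have each page include only the scripts it actually uses.

```html
<!-- every page loads the shared, minified script -->
<script type="text/javascript" src="js/common.min.js"></script>

<!-- but only the contact page loads the contact-form validation -->
<script type="text/javascript" src="js/contact-form.min.js"></script>
```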

6) Link to Google’s jQuery Library

This also reduces the load on your server, making for faster download times. Google serves the minified build of jQuery, which is much smaller than the uncompressed development file.
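Including it is a one-line change; pin whichever version your code is actually tested against (1.3.2 below is just an example):

```html
<script type="text/javascript"
  src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```

A side benefit raised in the comments below: a visitor who already loaded this file on another site may have it cached, so their browser skips the download entirely.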

7) Fix Broken Links & JS Errors

Every request for a file or script that isn’t there wastes the visitor’s time, so cleaning up broken references speeds up their download times.

8) Header Tags

Just as important as website title tags, header tags are a great place to add keywords. Headers should be used sequentially. Therefore, your <h1> tag should probably be the website name or section name, <h2> should be the page name, and <h3> should be a subheading or subsection name.
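Following that ordering, a page’s headings might look like this (the text is illustrative):

```html
<h1>FreelanceFolder</h1>                      <!-- website or section name -->
<h2>10 Ways To Increase Traffic</h2>          <!-- page name -->
<h3>Validation and Semantics</h3>             <!-- subheading -->
```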

9) Optimize Your PHP

There are several ways to optimize PHP specifically to speed up download times:

  • Variables – Declare variables outside of the loop, so each one is declared once instead of on every iteration.
  • Functions – Pass variables by reference to functions, instead of passing copies.
  • File pointers – Always close these!
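A sketch of all three tips in one place. The array contents, function name, and file path below are made up for illustration, and the pass-by-reference tip is debated in the comments, so benchmark it on your own code:

```php
<?php
// Hypothetical example - data, function and file names are illustrative.

$articles = array('SEO Tips', 'seo tips', 'jQuery Tabs');
$needle   = 'SEO Tips';

// 1) Hoist loop-invariant work: strtolower($needle) runs once,
//    not on every pass through the loop.
$loweredNeedle = strtolower($needle);
$matches = 0;
foreach ($articles as $article) {
    if (strtolower($article) === $loweredNeedle) {
        $matches++;
    }
}

// 2) Pass a large array by reference so the function works on the
//    original instead of a copy (debated - measure before relying on it).
function appendSiteName(array &$items) {
    foreach ($items as $key => $item) {
        $items[$key] = $item . ' | FreelanceFolder';
    }
}
appendSiteName($articles);

// 3) Close file pointers as soon as you are done with them.
$fp = fopen('cache/articles.txt', 'r');
if ($fp !== false) {
    $firstLine = fgets($fp);
    fclose($fp);
}
?>
```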

10) Use htaccess Files

Use .htaccess files to rewrite your page names (especially if you’re passing PHP variables, like I do), including important keywords separated by dashes. You can also use them to redirect users to another page, in case they bookmarked an old page that no longer exists.
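For example (the file and parameter names here are hypothetical), a .htaccess file can map a keyword-rich URL onto a PHP script and send old bookmarks to a page’s new home:

```apache
# Map a keyword-rich URL onto the real PHP script
RewriteEngine on
RewriteRule ^freelance-web-design-tips/?$ /article.php?id=42 [L]

# Permanently redirect visitors whose bookmarks point at a dead page
Redirect 301 /old-page.html /freelance-web-design-tips/
```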

SEO is Not Just Keywords

Good SEO depends on a variety of things, not just keywords and links. While changing your code won’t really help a site with little or bad content, it can be a great boost to a site or blog just starting out.

Have there been things you’ve changed in your code that have increased your traffic? What did you do?

Photo courtesy of Burning Image


Comments

  1. says

    Great article! Some thoughts…

    What happens if you link to Google’s jQuery file, and their server is down? ;) I usually recommend people host as much as possible on their own site, so their page load doesn’t hang when something external is unavailable.

    Also, a good read from Google on SEO and keywords:

    “Google does not use the keywords meta tag in web ranking”
    http://googlewebmastercentral.blogspot.com/2009/09/google-does-not-use-keywords-meta-tag.html

  2. says

    @sriganesh Thanks!

    @Squaregirl I think it’s an extremely small chance that Google’s server will ever go down. You have more of a chance of your own server going down, especially if it’s bogged down. Also, Google does not use meta keywords like you said, which is why I didn’t add them to the blog post, but they still use keywords on the site in title tags, link and alt tags like I mentioned above.

    @Phil Why don’t you pass parameters into functions? I’ve always heard it speeds up loading time.

  3. says

    There are a lot of comments in here that are outdated beliefs about SEO that I would like to set the record straight on.

    “You’ll already get a slight increase in rankings from a validated site, without even taking content into consideration.”

    If you watch this video you’ll see that around 1:30 Google’s spokesperson very clearly says there is no boost whatsoever for having a validated website:
    http://www.youtube.com/watch?v=FPBACTS-tyg

    “Flash websites are pretty much ignored by Google.”

    According to google’s blog they do read flash content these days: http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html

    A better statement would be “Poorly designed flash sites are invisible to Google” if you convert text to shapes in flash, then yes, your site will become invisible to Google.

Don’t get me wrong, most of this info is useful. Another common misconception about SEO is that the most important factors are on-site optimization, and that simply is not true. On-site optimization only helps Google know what your website is about. If you want to rank higher, more important factors are the quantity of in-bound links and the nature of the websites that are linking to you.

  4. says

    Hi Amber,

    Nice post, but there’s an inaccuracy in the info on rankings being affected by valid semantic code.

    Google have stated previously that this doesn’t affect rankings whatsoever. I’ll give you an example which has been cited to me before.

“If a college professor decides to put up a webpage on a new discovery which isn’t semantically correct (let’s say he generates his HTML from MS Word) – should that document be weighted / penalised because his HTML isn’t valid?”

    However cleaner code means that search engine spiders are likely to digest more of your content more frequently and more easily.

  5. says

@SEO Consultant The boost in rankings from validated sites comes from the fact that the site is quicker and easier to load, as it doesn’t contain errors that spiders have to jump through. Validated sites also normally contain better content & keywords thanks to cleaner code. Guess I should’ve clarified that :)

    Personally I’ve never seen an entire flash site that had high rankings (excluding big name companies, etc), so I’m not sure how well Google’s new flash ranking works (of course I haven’t seen every flash site…just my experience).

    You are right, the most important factors are all off-site; backlinking, etc. I thought it would be helpful to show some ways to improve html coding all around that would also improve SEO at the same time.

    Thanks!

  6. says

Yeah, these are all great ways to improve your website’s ranking in Google. Flash-based websites, although they look fancy, really are a waste if you don’t get seen in Google. I’ve been using Flash as part of my website design rather than building the whole website in Flash, but I’d like to look more into jQuery.

  7. says

We now use XHTML 1.0 Transitional and CSS 2.1 in our designs.

    I’ve had a few of our sites rank really well after a month of being launched.

I’ve recoded a few sites that had layers of HTML inclusions, table layouts, and images that were huge but shrunk to size with code.

    These sites are loading faster and seem to gain better results too.

    I can’t find the “official” Google comment about loading speed. Anyone have that link?

  8. says

    @Mike: “Flash based websites although they look fancy really are a waste if you don’t get seen in Google.”

    They also don’t work on mobile devices!

    @Amber: “You have more of a chance of your own server going down, especially if it’s bogged down.”

    But wouldn’t your whole site be down then? ;)

  9. says

    Great SEO tips. Was just going to say that, oddly enough, I actually found the jQuery off the Google server was slower than pulling from my server. So, I recommend everyone do their own speed tests over a period of time to see the result.

Second, I didn’t see any difference between the Google version of the jQuery file and mine. They’re both minified.

  10. says

Why does everyone assume we all use jQuery? There are so many great JavaScript frameworks out there; just because jQuery became popular lately doesn’t mean it’s the only one (for a start, check how many more are hosted on Google). The part about replacing Flash especially should rather say to replace Flash animations with a JavaScript equivalent. It’s like calling all mobile phones iPhones.

As for the Google-hosted AJAX libraries idea, it’s a double-edged sword. On one hand, to speed up your site everyone recommends keeping all the files on the same domain to avoid extra DNS lookups. On the other hand, the argument for these libraries is that if everyone starts linking to them on Google, the likelihood is that a user visiting your site also visited another site in the past that used the same library off Google, and will therefore reuse it from the cache, speeding up your site.

  11. says

    I disagree with number 5. A large portion of the loading time is simply the HTTP request. Breaking the JS and CSS files into portions for each page causes more overall HTTP requests, meaning slower load times. If you have all your CSS/JS in a single CSS/JS file, the first page may be loaded slightly slower, but the rest of the page loads will be quicker since you can take out an extra HTTP request.

  12. says

@Theron you break the js into multiple files so you aren’t loading unused js on every page. From my experience, it significantly speeds up loading times, especially if you’re using a lot of js. If you’re using the same js on every page, then you are right, it’s speedier to have it in one file.

  13. says

    I’m a big fan of using heading tags correctly. It amazes me how many sites I find that have h2 tags before or without h1 tags. Using heading tags consistently will indicate to search engines and people the value of the content to follow.

  14. says

    @Amber: I guess it depends on the situation. If you have a 13kb JS file used on only one page, load it only on that one page. However, if you have 0.5kb on each page, you might as well clump it all together and save the HTTP requests.

  15. says

    A great, interesting article on SEO here, thanks for the tips. Search engine optimisation is complex, and there are so many different methods to help a site rank better that it can become a bit of a minefield to do successfully.

    It is always good to see a comprehensive list like this of factors that really do work. Although there are some obvious ones on the list, such as Flash not being SEO friendly and sites needing validation, others such as optimising PHP and JS are less obvious and well known.

    I’ll be taking note of these and cross-referencing them with the range of optimisation techniques we use to ensure sites are ranked highly, and are as validated and well built as possible. It will be interesting to read the feedback below on other people’s SEO approaches.

  16. dan says

I try to avoid leaving negative comments at all costs, but if you follow the advice in the above article, you’ll spend a lot of time and put a lot of work into things that will reap no real benefit for you at all.

  17. says

    “your <h1> tag should probably be the website name” – I don’t think this is right. Why would you make the most important heading on the page the website name?

    The section or category is a better bet!

  18. says

Great article, though there isn’t much proof that Google is more likely to rank semantic sites higher; there are other factors within the code that make a site more likely to rank well.

    There are a few things that you missed out here that you should always do with your website, and I always make sure I do them.

    Have a good link anatomy:
There are far too many sites out there relying on the title tag and content when some of your site’s juice is in its URLs. For example, many people just leave the default WordPress permalinks on, but you’re better off having some keywords in there too – don’t miss out on them! You don’t necessarily have to put pages in sub-folders; you can use the .htaccess file to rewrite the URL for you:

    “Options +FollowSymlinks
    RewriteEngine on
    RewriteRule ^home(/)?$ /index.htm [L]”

    Remember, [L] only goes on the last line. You can find out more information about this here: http://www.seomoz.org/blog/seo-cheat-sheet-anatomy-of-a-url.

    One error a lot of people walk into with this is missing images/css files. The reason for this is because the .htaccess rewrite tricks your server into thinking the site is located elsewhere, so just stick an absolute URL in place of all of the relative URLs on your pages:

    “<link rel="stylesheet" type="text/css" href="css/style.css" />” changes to

    “<link rel="stylesheet" type="text/css" href="http://www.yourdomain.com/css/style.css" />”

    (The file names are just an example – the point is swapping relative paths for absolute URLs.)

    Well written, valid sitemap:
    Whenever you create a site, you should always create a sitemap with the intention of submitting it to both Google and Bing Webmaster central. Your best bet is to head over to http://www.sitemaps.org/ for information on creating this. For dynamic websites, such as WordPress there are usually plugins that do this automatically for you, these will save you a lot of time and hassle. Remember to check back on your sitemap once it’s submitted to make sure it is written correctly and there are no errors. They are very easy to fix and Google will explain what your errors are and how to fix them.

    “<url>
    <loc>http://www.yourdomain.com/</loc>
    <priority>1.0</priority>
    <changefreq>never</changefreq>
    <lastmod>YYYY-MM-DD</lastmod>
    </url>”

Some people may not consider this “on-site”, but it is! Any SEO work that doesn’t involve gaining valuable links, and is uploaded with the site – i.e. no external work – is on-site. So never forget this!

    Valid & Working Pages:
    Though this is on-site, this will involve a little waiting and some analysis through the webmasters tool. In Webmasters, click your URL and then Diagnostics > Crawl Errors. In here you will see any 404s that you should fix with a simple Redirect in your .htaccess file:

    “Redirect 301 /404-link/ /working-link/”

    This has got to be the easiest line of code that will ever be in your .htaccess file and it’s worth going through all of those 404s to get your link structure correct!

    The canonical URL tag:
    Very few people use or are even aware of the canonical URL tag. Its purpose is to tell search engines that several URLs should be considered one page – to stop any ‘overtaking’, as it were, in the SERPs.

    “<link rel="canonical" href="http://www.yourdomain.com/page/" />”

    For a full explanation of when, how and where to use the tag, visit SEOMoz: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps

    Simplify:
    If your site has a complex link architecture, then it’s unlikely that it’s going to rank very well OR be very user friendly. What you want is a site that is both easy to use by real people and by bots crawling the site.

A great way to simplify the structure of your site is by consolidating content and using hash-tag anchors rather than separate pages! It’s absolutely fine to have one services page with everything on it, rather than twenty different pages each holding a single service. Just use hash-tag URLs to link to the sections, and put them all on one page!

    Sorry for the essay, I just don’t want people to just stick to these 10 points. There are many more things that need to be done as well as these. If anybody needs any help with this, don’t hesitate to get in touch and I’d be happy to lend a hand!

  19. says

    My previous comment doesn’t seem to be appearing and I spent quite a while writing it.

  20. says

@ Dan All of this is basic on-site SEO and has personally taken my site from a 2,000,000 Alexa rank to a 200,000 rank in less than 3 months. How was that a waste of time?

    @gipi you’re technically supposed to put the most important information first in an h tag, which would be the site name, then the page name would be in an h2 tag. If you take a look at a prebuilt system like wordpress, you’ll see they do this as well.

@khopdi a Flash site will be indexed by Google and show up in search results, but the content will most likely not be read by Google, therefore you’ll have no on-site SEO

  21. says

    Just to the admins/mods on here, I’ve submitted a 700 word comment on here and it hasn’t been approved yet. It should help out users who have read the article and want some more information :-). So if you could get that approved ASAP that would be cool.

    Note, it contained several links and some code examples, so it may have been killed by akismet.

  22. says

Excellent list. Many people worry too much about off-site optimization and marketing. That’s helpful, but to get the most out of your website and get a lot of web traffic, you have to optimize your website on-site as well by doing these things.

  23. says

    @WebTraffic

I couldn’t disagree more. To get the most out of your website you need links – they’re the currency that we SEOs deal in, and without them our sites get nowhere. On-site and off-site optimisation go hand-in-hand; without off-site there would be no point in doing the on-site work and vice versa, though people should ALWAYS spend as much time as possible on off-site optimisation. Get links from sites with good authority, and you’re going to get noticed.

  24. says

    Found this to be somewhat short on writing, but very informative, as it brought some useful information to mind and got me thinking again about creating sites, along with their content and other website-related matters.

    All in all, an amazing post, especially for an early-morning read. Makes for a great head start on thinking and starting off right. Thanks!

    - MexiChriS

  25. says

I had no idea that these could make such a big difference for our blogs. These are the little things we should look at before doing anything else.

  26. OIS says

    5) Optimize JS & CSS Files
This is wrong. You should combine and compress your js and css files into one each, and put them on a different (sub)domain. Browsers have about 2–4 max open connections to a host at a time. You should include a last-modified timestamp in the filename, and make the files cacheable (set the expiry a few months into the future).

    9) Optimize Your PHP
    * Variables – Declare variables outside of the loop, so its declared once and not repeated.
    – Always process the parts which will be the same for each iteration of the loop before the loop into variables. Good logic > “minor optimisations”.
    * Functions – Pass variables by reference to functions, instead of passing copies.
    – No. As others have explained.
    * File pointers – Always close these!
    – All file pointers are closed automagically. You only have to close them if the files will be written to by other processes. Usually you don’t use a file pointer anyway; just read the file into a variable or use readfile.

    10) Use htaccess Files
    htaccess files are “slow”. Apache has to check each folder for a .htaccess file, then read it – for each request.
    If you have access, you should instead merge it all into one conf file for the [virtual] server.
    You should also keep the page names in a db (maybe cached if you get lots of requests) or a PHP config file: $config = array_merge_recursive(require(‘config.common.php’), require(‘config.server.php’));

  27. says

@OIS I’m sorry, but it’s not wrong. If you have the same js functions on every page, then yes, they should be in one file. But if you have function a on one page and functions b and c on a second page, they should be split into 2 js files and each called only on its own page, so the browser isn’t loading unused js on every page. There have been plenty of speed tests done on this.

  28. says

SEO isn’t as easy as you’ve posted here. What you post here is only a small percentage of the search engine ranking factors. However, I like the title and content, which seem to provoke users to visit, comment, and of course increase your traffic.

  29. LautaroAlberto says

    Well, one thing about my own philosophy on web matters is that nothing is set in stone.

    Despite the fact that some of the advice is indeed outdated, I give credit to the author, and will keep digging into every subject that was discussed.

    On the other hand, we all have our own methods; what we think is right and wrong shouldn’t take the place of research in a discussion like this. Instead of “thinking,” do better research and make sure you are doing it right. This is computer business, therefore it’s 99% math based, and everything has a reason/source – so go find it, then post it here if you want to be heard and not flamed for misconceptions.

  30. says

SEO is way more complex, but at least you covered some basics. Now Google is using stars in their searches, and there’s a new PR update around the corner. How will this affect our SERPs and PR? On top of that, I heard Google executives were saying that PR is overrated.

