
URL Re-writing Best-Practice

Introduction

A while back, I wrote this article about how Google recommended NOT using URL re-writing.

The Google article provoked a lot of debate and certainly managed to infuriate a lot of SEO experts.

Google's advice seemed pretty conclusive after that: don't use URL re-writing. However, Google later released their updated SEO guide, which muddied the waters again. Page 8 had the following to say about URLs:

If your URL contains relevant words, this provides users and search engines with more information about the page than an ID or oddly named parameter would

Now, this isn't directly saying that Google indexes the words in URLs. The example shows the URL being displayed in search results for the user to see, so the advice could easily be misconstrued as meaning more than it does. It does, however, re-open the argument about whether we should add keywords to URLs for SEO, and that inevitably leads to URL re-writing.
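To make that concrete, URL re-writing is what lets a keyword-rich address such as /article/url-rewriting-best-practice stand in for something like /article.cfm?articleID=42. Below is a minimal CFML sketch of one common way to resolve the keyword part of such a URL once the web server has handed the request to ColdFusion; the slug format and the getArticleBySlug() lookup are hypothetical, purely for illustration.

    <!--- article.cfm: reached via a rewrite rule or an SES-style URL such as
          /article.cfm/url-rewriting-best-practice --->
    <!--- Depending on the web server, cgi.path_info may or may not include the
          script name, so take the last path segment as the keyword slug --->
    <cfset urlSlug = listLast(cgi.path_info, "/")>

    <cfif NOT len(trim(urlSlug)) OR urlSlug EQ "article.cfm">
        <!--- No slug supplied: fall back to the article index --->
        <cflocation url="/articles.cfm" addtoken="false">
    </cfif>

    <!--- getArticleBySlug() is a hypothetical lookup mapping the slug to an article record --->
    <cfset article = getArticleBySlug(urlSlug)>

The rewrite itself is handled by the web server (mod_rewrite on Apache, or a rewrite add-on for IIS); the CFML above only covers what happens once the request reaches ColdFusion.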

I've a number of greenfield projects starting soon, and I've been mulling over whether to use re-writing again. The unresolved question of whether Google indexes words in URLs will obviously play an important role in any decision, but the usability argument is possibly stronger. There doesn't seem to be much in the way of research or best practices, so I've written this article to start a discussion on it.

As I'll explain below, I'd like things to be as evidence-based as possible, with links to relevant authorities. Most of the tips are my own thoughts on the matter, but I've tried to back them up with reasoning.

I welcome comments with evidence or reasoning, whether they back up or argue against what I've written. That way, we can put together a strong set of guidelines which will hopefully benefit everyone.

[More]

Browser Bugs in IIS Logs

I spent several hours today wading through IIS access logs. Not the most fun part of my job, but sometimes necessary. In the course of it, though, I found a subtle difference in the way clients (browsers) pass the information that ends up in these logs. Hopefully it will be of use to someone else.

[More]

How to Tweak IIS to Improve Your ColdFusion Sites' Performance

Following on from yesterday's review of High Performance Web Sites, I had a quick go with YSlow, to see where I could optimise this site.

The starting point was a respectable score of C (71) for the home page. The first rule that didn't score well was number 2: Use a Content Delivery Network. A CDN is overkill for a blog, so that left me looking at rules 3 (Add Expires Header), 4 (Gzip Components) and 13 (Configure ETags).
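For rule 3, anything served through ColdFusion itself can at least be given a far-future Expires header from CFML; a minimal sketch is below, with a one-year lifetime chosen purely as an example rather than as the setting used on this site. Static assets, along with gzip compression and ETag behaviour, have to be configured in IIS itself, which is where the rest of this post comes in.

    <!--- Far-future Expires header (YSlow rule 3) for content served by ColdFusion.
          The one-year lifetime here is illustrative, not a recommendation. --->
    <cfheader name="Expires" value="#getHttpTimeString(dateAdd('yyyy', 1, now()))#">
    <cfheader name="Cache-Control" value="public, max-age=31536000">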

[More]

Book Review: High Performance Web Sites

When a website is running slow, the traditional response has been to look at optimising the application code. In fact, around 90% of the response time is spent downloading and rendering the page and its constituent elements. In the book High Performance Web Sites, Steve Souders describes 14 rules for improving front-end performance.

[More]
