It's Called SEO and You Should Try Some

Nov 18, 2009

It’s been a while since I last posted, but this bears note. Search engine optimization, commonly called SEO, is all about getting search engines to notice you and people to come to your site. The important thing about good SEO is that it does more than simply get eyes on your site; it gets the RIGHT eyes on your site. People typically misunderstand the value of optimizing their site, or they fear it will radically alter the layout, message or other core elements they hold dear.

First, what SEO isn’t. I think it’s best to get this out of the way early so we can get into helping you do good stuff without a bunch of “but-but-buts.” So, SEO isn’t cramming a bunch of keywords into the bottom of your page. It also isn’t redesigning your entire site so it looks like garbage but Google can read it like a dream. SEO is not putting your site on every link farm in the world and it is not spamming people on social networking sites. SEO is also not spamming people on message boards. SEO is not about fads and fast grabs. It’s not about people coming to your site and then bouncing immediately. SEO isn’t about a bad web experience, plain and simple.

Now let’s look at what SEO is. SEO is about strategic placement of key concepts in your web site to encourage traffic that will be interested in your message, product or service. SEO is about making the best of what you have to offer and making your web presence work for you. SEO is about traffic analysis and evolution. SEO is about marketing in a smart way and encouraging your customers to think of you first. SEO is about becoming an industry leader and a recognized authority.

I don’t believe in mincing words or trying to sneak around and do back-room deals to become another SEO douchebag, so I felt it was only right to lay that all on the line first and foremost. Now that we have a picture of what SEO is and is not, we can benefit from looking at ways to improve your site today and give you tools to improve your site more over time.

Keywords

Keywords are important for any search. Regardless of how the searching is done, it eventually comes down to what the customer is searching for, and keywords are precisely that search intent in crystallized form.

The thing to know about keywords is they are meant to be a focus. If you are planning on making a page about anything, it would behoove you to understand the essential idea you are trying to convey. Once you have this in mind, write down three to seven keywords. Use these as a guidepost and they will keep you on target. Moreover, if you are on target and your keywords were selected properly, they will appear naturally in the text. This is good. Don’t try to overdo it. If a keyword is important it will appear in the main copy a few times; in 400-1000 words of copy, a given keyword should appear fewer than 10 times.

Titles

Every document should have titles. These titles are going to range from the overarching document title down to sub-sub-subtitles. This hierarchy of titles is important in SEO because it tells both your reader and the search spider what is most and least important. If you have good titles, they will work for you. If you choose poor titles, or worse, your hierarchy is haphazard, then they may well work against you. Take care to pick the right title structure for what you are actually trying to say and you will do well.
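To make this concrete, here is a minimal sketch of a sane title hierarchy in HTML; the page topic and headings are invented for illustration:

<!-- One h1 per page: the overarching document title -->
<h1>Hand-Roasted Coffee Beans</h1>

<!-- h2 marks the major sections -->
<h2>Our Roasting Process</h2>

<!-- h3 marks subtopics within a section; never skip levels -->
<h3>Small-Batch Drum Roasting</h3>

<h2>Ordering and Shipping</h2>

Both your reader and the spider can scan this outline and immediately tell what the page is about and which ideas are subordinate to which.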

In-bound Links

Any site that is considered an authoritative source on anything is bound to have in-bound links. If people care about what you do or say, they are going to refer to you for citation. The people who created search algorithms know this and they take advantage of it. The more in-bound links your site has, the more likely it is to be an authority. Authoritative sources show up higher in search rankings than derivative sources. Keep this in mind and strive to be an authority. Pick something you do that the rest of the world ought to know about. Push that and become a key player. This will boost your site rankings, and it is a generally good business practice besides.

While we are discussing links, let’s discuss directories. There are several directories on the web but, as far as I know, there is only one that is still completely human edited and maintained. That site is the Open Directory Project (http://dmoz.org). Since the Open Directory Project is human maintained, it is given more value by the search engines. It is the equivalent of having a single person, who is a noted authority, personally vouch for your site. This would be like having someone with a doctorate vouch for your research in their field. It’s mega bonus points and you should use it to your full advantage. It takes a while to get listed so don’t fret if, after you submit your site, it takes months to see a result.

Meta Information

This is where things get a little more technical. Meta information, generally delivered through meta tags, provides spiders with direct information about your site. You can add things like a description and keywords. Both of these will help people find your site more easily. Meta keywords typically aren’t given much weight anymore, since people abused them in the past. Your meta description is the vital one. Google, for one, uses your meta description to tell people about your site in your own words. That’s a good thing, since it gives you personal control over what people see before they hit your site. Below are the tags that should be included in the head of your page:

<meta name="description" content="your site description goes here" />
<meta name="keywords" content="your keywords go here" />

Document Format

The underlying format of your document will tell spiders a lot about what they are looking at. This is one of those items that can be worked on over time. As long as spiders know you are out there, they will check back on pages from time to time to ensure they know about the latest changes. First, it is best to pull your site out of that table layout you are using. Spiders think tables mean that each piece of data is related to another in a certain way. If your entire format is a table then they will not correctly interpret the content you have on your page and you might lose brownie points.

Commonly, sites are created using page divisions, as God intended. This means that you tell the browser “this is a piece of this page and it stands on its own.” Spiders can read this much more easily and the whole site degrades much more gracefully if you do a good job. Graceful degradation makes your users happy, especially users with limitations who are browsing with specialized software. Once your document is formatted properly, you can arrange your divisions into an aesthetically pleasing format using Cascading Style Sheets (CSS). CSS is outside the scope of this discussion, so I am going to let that dog lie.
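As a rough sketch, with made-up ids, a division-based page might be structured like this, with all the visual arrangement handled by the stylesheet:

<div id="header">Site title and logo</div>
<div id="navigation">Primary navigation links</div>
<div id="content">
    The main copy lives here, and it remains readable even with no CSS at all.
</div>
<div id="footer">Contact and copyright information</div>

Strip away the stylesheet and a screen reader, an old browser or a search spider still gets the content in a sensible order.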

Sitemaps

I’m not talking about your run-of-the-mill sitemap for your visitors. I am talking about a carefully crafted and standards compliant XML sitemap. There is a standard used for creating a sitemap and, once you have one, it makes indexing your site a breeze. Search spiders commonly grab a sitemap so they can better understand what they should and should not index. Sitemaps also allow you to tell spiders how often certain pages are updated. This allows them to index pages that change all the time more often than pages that may not change for months at a time.
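For reference, a minimal sitemap following the sitemaps.org protocol looks something like this; the URLs, dates and frequencies are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>

Drop a file like this in your site root as sitemap.xml (or point to it from robots.txt) and spiders know which pages to revisit daily and which they can safely leave alone for a month.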

Robots.txt

The robots.txt file is a simple text file that tells search spiders what to index and what to ignore on your site. It’s similar to a sitemap, but the parameters are limited. You can say “index this” and “don’t index that.” This is great if you have some pages that are currently in development, should not be seen by the public or other strangeness like that. This also allows you to have a copy-testing site that should be ignored. Duplicate copy on sites is looked down on by search engines, so anything you can do to avoid indexed duplicate copy is a good thing. I typically have a sparse robots.txt file as most of my site is viable content, but government agencies and Rupert Murdoch seem to like robots.txt quite a bit.
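A sparse robots.txt of the kind I keep might look like this; the paths are hypothetical:

# Let every spider in, but keep works-in-progress out of the index
User-agent: *
Disallow: /dev/
Disallow: /copy-test/

# Point the spiders at the sitemap while we are at it
Sitemap: http://www.example.com/sitemap.xml

Everything not listed under Disallow is fair game for indexing.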

.htaccess

This is probably a book in its own right, but it is something that people should be aware of. The .htaccess file allows administrators to control site access and redirect people to new pages and away from missing pages. Correct use of the .htaccess file can limit the number of broken pages that spiders will encounter and keep your visitors happy. One of the best features of a good .htaccess file is the ability to redirect users and spiders alike to new pages and send a message back to them, letting them know the redirect is permanent. Spiders like knowing that a page has moved. It makes the whole process of reindexing faster and easier. This kind of redirect is generally called a 301 redirect, after the HTTP status code that is returned to the browser.
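On an Apache server, a permanent redirect in .htaccess can be a one-liner; the paths here are placeholders:

# Send visitors and spiders to the new page with a 301 (moved permanently)
Redirect permanent /old-page.html http://www.example.com/new-page.html

The spider sees the 301, updates its index to the new address and stops asking for the old page.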

Blogging

We live in a time where the blogosphere is king. What this means to the rest of the world is that blogs influence online life. Blogs change rapidly and bloggers tend to stay atop issues that are near and dear to them. Blogs are also a boon to your industry. If you have a blog that reflects knowledge and a profound understanding of your industry, you are more likely to be considered an authority. Blogs also give your site an opportunity to generate new content on a regular basis. Search spiders like new content and index regularly updated sites more frequently. This also provides an opportunity to share information about your industry in a non-business environment and to generate in-bound links; you remember what I said about those, right?

Social Networking

This is closely tied to blogging, but it can impact your business and website in many ways. From forums to MySpace to Facebook to Twitter and others like them, people will talk about what they do and don’t like. If you are a well-liked provider, the word will get around and people will head to your site. These sites help people find things that others have recommended and they are a great source of in-bound links. Search spiders check these sites often, as content changes minute to minute. Also, if someone recommends your site on a social network, it is taken as a personal recommendation and spiders will take note.

There are so many ways to take advantage of social networking that it should probably be at least one, if not several, college courses. I did choose to list this one last, however. If you have not worked on everything else first, social networks can be your worst enemy. People will say negative things about their experience and spiders will touch your site more often only to pick up your poor SEO. Once this happens, it’s downgrade city, so watch out!

In the end, there are many facets to SEO, but most of them can be worked on and improved by anyone who helps build, update or administer your site. With this information I charge you: go forth and make the web a better place.

Information and the state of the web

Jun 9, 2009

I only post here occasionally and it has crossed my mind that I might almost be wise to just create a separate blog on my web server.  I have these thoughts and then I realize that I don’t have time to muck with that when I have good blog content to post, or perhaps it is simply laziness.  Either way, I only post when something strikes me.

Oh, and strike me it did.  Today I was hit with a brick by some guy running a site all about that crazy thing called love.  Lies!!  It was all about the web.  More importantly, it was about how to do various things on the web.  The site is called “The Noodle Incident” (http://www.thenoodleincident.com/) and, though it was not my favorite site to visit, I wasn’t terribly bugged about it.  Let me say, I wasn’t bugged until I got to the design page.  Finally, I’d had enough.

The navigation was impossible, the choice of navigation copy was mystery meat and there was no way to navigate backwards.  I have to hand it to the guy: his site was clean and easy to read.  Applause deserved for that, but while he realizes that machines and all sorts of accessibility interfaces must interact with his site, he forgot that PEOPLE have to interact with the site too.  This is a really important thing to remember.

Wonderfully, this brings us to the topic du jour: Information Architecture and User Experience.  These are buzzwords right now, but they are really important buzzwords.  They represent something that people have worried about and fussed over for ages.  The question is always the same, “how do I make this easier for people to do?”  The IxDA community holds the key to this particular castle and I promise you the princess is in there.

So, where did our wily friend go wrong?  Simply put, everywhere.  Honestly, the site is easy to read as I had said before, however getting to that information is a real bear.  If you start on the front page then you are going to do well.  The main page of the site leads off to all of the information, as far as I can tell.  The real problem is navigating from another page back to main or to some other set of content.  If you landed anywhere but the front page of the site, forget about navigating anywhere without hand-editing the URL.  Long and short, accessing the content on this site is a challenge and that is bad.  This might be a good time to note, information is still king and getting to it is the only way to ensure optimal reader retention.  If your readers can’t access your content, they are going to assume you have none and leave.  It’s as simple as that.

Another big pitfall is his navigation location.  People learn to rely on the location of menus and such when visiting a site.  Optimally, you should have a strict, well thought out navigation hierarchy that you adhere to in the most draconian sort of manner.  I’m not kidding.  Lop off the hands of the people that defy you.  You’ll feel better come the end of the day, I promise.  You will see immediate benefit as your users learn to trust that your navigation will remain right where they saw it last as they move from page to page to page through your site.  Key thought here: if your user doesn’t notice the architecture of your site, you did a good job.

Finally, the most embarrassing problem with The Noodle Incident, aside from having an uncanny resemblance to a Guns ‘n’ Roses album title, is that it suffers from Muphry’s Law.  Technically, Muphry’s Law applies to editing mistakes but, when generalized, it says “whenever you critique something, you are bound to have an error of the same type in your critique.”  This relates to something really important: be sure that your content is useful, correct and does not point out flaws in your own site.  Nothing turns a user off faster than going to a site that is supposed to be an authority only to discover that they are incapable of following their own rules.  If you post authoritative content, be sure that you really know what you are doing and double check that you aren’t going to be embarrassed by it later.

The take-away from all of this is that Information Architecture, attention to the User Experience and some careful content creation will lead to a happier, more productive site.  People will enjoy visiting and may even take you seriously.  Focus on navigation, findability and accessibility.  These items, coupled with a site that is easy to read will lead to a better web experience for everyone involved.

Browser Wars

Apr 13, 2009

It’s been a while since I have posted. I know. For those of you that are checking out this blog for the first time, welcome. For those of you who have read my posts before, welcome back. We’re not here to talk about the regularity (or lack thereof) that I post with. What we are here to talk about is supporting or not supporting browsers. So first, what inspired me to write this? Well… this:

We Don’t Support IE

So, this brings a question to mind – which browsers should we choose not to support and for what reasons?

This is an easy question to answer.  You support all of them.  Yep, you heard me right.  You support everything.  You are mindful of browser incompatibilities, inequities, disabled users, mobile users and users you had never even thought of before.  You are aware of the fact that browsers come in multiple versions and you make your sites backwards compatible.  Long and short, do not tell your users what to do.

Now, a caveat to all of this must follow.  If you are creating a web site geared toward the bleeding edge crowd, you can probably inform your users that they should hop on the newest tech to get the full features of the site, but even then, you should never, never, never create a site that displays no useful information to a user that does not fit into the spectrum of your audience.

Now, before people hop on me for claiming that the We Don’t Support IE site is encouraging people to make their web sites inaccessible to all IE users, I am not saying this.  What I am saying is ignoring the IE crowd is throwing away, at the very least, 50% of your audience.  More than likely, you are going to be tossing out more like 70% of your audience.  This is a bad idea if you plan on doing anything even remotely commercial with your site.

This discussion could bear a little bit of transparency.  I do web development for a living, and I tend to spend a lot of time focused on user experience.  I mean a lot of time.  That being said, I spend quite a bit of time listening to people explain what they do and don’t like about the way that something functions.  Moreover I see a lot of really bad sites.  By this, I mean horrible, awful, not fit for use web sites.  So I am not going to just pound on the Firefox/Mozilla, Opera, Safari/Webkit/Chrome crowd.  I understand that this is the group that would rather see Internet Explorer gone, but let’s be realistic, IE is probably going to be around for quite a while yet.  Get used to it.

Like I said, though, I am not going to just pound on one camp.  You Microsoft guys get your lumps too.  See, I code in PHP, but I also code in ASP.Net and C#.  That being said, I know the dirty nastiness that lies under the hood of the MS technologies too.  I have seen sites that were built solely in ASP.Net and whatever code-behind model they chose which catered only to Internet Explorer.

Now, I understand that IE has access to neat little .Net architecture tools that other browsers don’t play so well with, but I have seen sites that were simple, straightforward websites that when viewed in IE looked great, but heaven forbid you use anything else.  Unforgiving is too gentle of a word for what I have seen.  Pages rendered completely unreadable, forms that stretch across the screen and then some.  Serious kinds of ugly.

Just to inject my personal bias so everyone can see where I come from on a user-side standpoint:  I like Firefox.  I use it a lot.  I am comfortable with it.  It makes me feel all warm and fuzzy inside.  I really detest using Internet Explorer.  I find myself limited more often than not with it.  IE has gotten better in the past year or two, but I am still not a fan.  It’s just the way it is.

So bearing my bias in mind, I have to say this: what you like, appreciate or prefer to work with does not matter.  The only one that matters is your user, and you should aim to create as close to the same experience as possible for every user who arrives at your web site.  If you have a menu that looks killer in Firefox, but can’t be created in IE no matter how you try and it is unusable for more than half of your users, scrap it.  If you are unable to tweak your CSS to make everything feel similar, do more research or pick another layout.  It is that simple.

This browser fight is very reminiscent of the ’90s, when everyone had “Get Netscape” or “Get Internet Explorer” buttons plastered all over their pages.  The web has grown, so it is time that we do too.  We cannot continue to battle this way or we will only alienate users that might otherwise be loyal customers.  In closing, we only hurt the user more by trying to force our preferences upon them.  Don’t do it.

Web Scripting and you

Oct 16, 2008

If there is one thing that I feel can be best learned from programming for the internet it’s modularity.  Programmers preach modularity through encapsulation and design models but ultimately sometimes it’s really easy to just throw in a hacky fix and be done with the whole mess.  Welcome to the “I need this fix last week” school of code updating.  Honestly, that kind of thing happens to the best of us.

Being that I am a web developer, specifically working in an interpreted language, there are two ways that things can go: clean, neat and easy to manage, or a horribly mangled mess.  My first couple of full-scale projects on the web were more of the latter and less of the former.  I cobbled things together any way that worked within the time frame that I was given.  Ultimately this meant little to me at the time, but for the people that are maintaining the code now…  I am terribly sorry.  Fortunately, I know the poor sap that is currently updating the code, so he has a resource to cut through the thick undergrowth of spaghetti.

Now fast forward a few projects and one ridiculously large CMS later and I have learned a few things about what not to do.  Lesson 1: don’t make a script that does everything.  Lesson 2: you are eventually going to have to look at that code again.  Lesson 3: When the code is not completely obvious (read this as print statements and built in functions being used in the simplest possible way) comments are always helpful. Lesson 4: even interpreted languages have debuggers, so use one. Lesson 5: make it modular.

Lessons 1-4 are things that everyone hears, ignores and then ultimately pays the price for.  Lesson 5 is something that is preached and never reached…  dig the rhyming scheme.  On the web, if you build something in a nice, chunked out way to begin with, your code will look like that forever more.  I promise.  Once you have built a handy little chunk like an order-processing script that just hums along and processes whatever you send it, you’ll never write a hack for this order or that one again.  I promise.  It won’t happen because you won’t need to.  You have a handy little piece of code that works like… say… an object!  WOW!  Who would have thunk it?

Now I write this not for the programmers that are in engineering teams out there working with a bunch of people that all have a standard that they follow and ultimately know all of this already.  I am writing this for the rogue programmer that has decided they are going to go it alone and do something stupid like write a custom CMS/Project Management System/Time tracker integrated tool… Man, that sounds really familiar.  Anyway, if you are going to tackle a large project all by your lonesome it is of the utmost importance that you make it as easy for yourself as possible.  I really like that I have built an ordering system where all I have to do is insert a new item and it is automagically updated and handled all over the place without any extra coding ever.  I don’t even have to do a database insert.  It’s all just done for me.  It’s really nice.

So some of the basic rules that I follow for no other reason than I have found them to work:

1) A script in a file does one thing.  Even if you think it should do x, y and z, it doesn’t.  If you coded it to do x, y and z all at the same time, one of those functions will break on you.  I have seen your code and your future; I know.  Trust me, one script, one purpose.

2) Create your directory structure BEFORE you write ANY code.  I generally include the following directories: page_elements, process, includes and templates.  This does not mean that you can’t expand, but generally 4 directories and root is the barest minimum.

3) If you think something should be an object, it probably should be.  Gee, I find myself pulling info from the database a row at a time an awful lot.  Should I make a row object?  Yes.  You should.  (There is a rough sketch of what I mean just after this list.)

4) One object, one file.  Don’t test me, boy.  See rule 1.

5) Break the system up into small, bite-sized pieces and create an API for plugins.  It can be rudimentary and even require a little code to plug the piece in, but you will save yourself a ton of work if you can just write the added feature without having to dig into anything else.

6) Figure out a layer structure and live by it.  I don’t care the model, just use it and make it work for you.  It doesn’t even have to be one of the widely recognized design patterns.  I use a home-grown MVC pattern myself and it works like a champ.

7) NO INLINE CSS!  Yes, I have broken this rule from time to time, but eventually I go back and pull it out into a file.

8) NO INLINE JAVASCRIPT!  No, I haven’t broken this rule.  I understand that you have to put in event handlers where you want the script to fire, but your script should not live in the document.  Plus, who knows, you might want that toggle element display script somewhere other than in the single place you built it originally.

9) Break up your scripts and include them as needed.  Both CSS and Javascript should function properly where they are needed, but they should be excluded when not needed.  I know that some people write these monstrous CSS files with inline server-side scripts to add in the extra pieces when they are needed, but honestly, isn’t it easier on you and the server to just include files when they are needed and not load them at all when they are not?

10) Commenting!  You know that crazy function that you wrote which required bit-shifting to make it happen?  Remember how it took you three days to figure out how to do it?  It will take you 6 days to untangle what you did when you look at it again.  If you had to think about something before you wrote it, put in comments.  The person that ultimately follows after you will thank you, and that might just be YOU.
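To put some meat on rule 3, here is a rough sketch of the kind of row object I mean.  The class, table and field names are made up for illustration and your database wrapper may differ, but the shape is the point: fetch a row once, wrap it in an object and pass that object around.

<?php
// One object, one file (rule 4), one purpose (rule 1).
class ProductRow
{
    public $id;
    public $name;
    public $price;

    public function __construct(array $row)
    {
        $this->id    = (int) $row['id'];
        $this->name  = $row['name'];
        $this->price = (float) $row['price'];
    }

    // Fetch a single row and hand back an object instead of a bare array.
    public static function find(PDO $db, $id)
    {
        $stmt = $db->prepare('SELECT id, name, price FROM products WHERE id = ?');
        $stmt->execute(array((int) $id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? new ProductRow($row) : null;
    }
}
?>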

Very well, that is all.  Off with you.  Go about your programming and make the web a little better place.

Occam's Razor

Jan 10, 2008

I have a particular project that I work on every so often. It’s actually kind of a meta-project as I have to maintain a web-based project queue and management system, so it is a project for the sake of projects. Spiffy eh? Anyway, I haven’t had this thing break in a while which either means that I did such a nice, robust job of coding the darn thing that it is unbreakable (sure it is) or more likely, nobody has pushed this thing to the breaking point. Given enough time and enough monkeys. All of that aside, every so often, my boss comes up with new things that she would like the system to do, and I have to build them in. Fortunately, I built it in such a way that most everything just kind of “plugs in” not so much that I have an API and whatnot, but rather, I can simply build out a module and then just run an include and use it. Neat, isn’t it?

So, today I was told that she really wanted to be able to update team members on a project and then update the status of said users. Now, the way the thing works is you update the list of team members on a project and then edit the project again to set the status. This is a little cumbersome, we’ve discovered, simply because we don’t use the system the way we thought we would. Isn’t this always the case? So my boss, specifically, goes and toys with the team members as she is working on a project. This is dandy, except that she has to update the team and then go find the project again, right away. Not so good. What she asked for is a way to update the team and then immediately update the status of any given member of the new team list.

My first reaction, mentally, was ‘great, now I have to build out some crazy AJAX to go behind the scene, update the team list and then cobble together the list of the current team, push out some dynamic content to the page and then update things on the fly.’ This is not my idea of a good day. I could have spent all afternoon working on this. Now being the planner that I am, I sat back and thought about this. This promptly put me into a slight daze and I took about a 5-minute nap. When I woke up it dawned on me: the requirements I put together in my head were not what my boss asked for, they were what I interpreted. My solution still uses a little JavaScript, but now there are just two buttons: one says ‘save,’ the other says ‘save and exit.’ When you click save, everything you did gets saved and you are returned to the page. From there, the page automagically builds and includes all necessary pieces. If you click save and exit, everything you did will be saved and you will be pushed back to the main screen.
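In case the shape of that is unclear, a minimal sketch follows; the form fields, page names and the save_team() helper are all invented for the example:

<form action="update_team.php" method="post">
    <!-- team member checkboxes and status fields go here -->
    <input type="submit" name="action" value="Save" />
    <input type="submit" name="action" value="Save and Exit" />
</form>

<?php
// update_team.php: save either way, then decide where the user lands.
save_team($_POST); // hypothetical helper that persists the changes

if ($_POST['action'] === 'Save and Exit') {
    header('Location: main.php'); // back to the main screen
} else {
    header('Location: edit_project.php?id=' . (int) $_POST['project_id']); // stay put
}
exit;
?>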

So, the takeaway from all of this is that Occam’s razor applies very neatly to web projects. I love neat stuff that flies all over the screen and interacts with the server through dynamic XMLHttpRequest calls (or ActiveX, if you must), but assuming all things are equal, the simplest answer is best. Why kill yourself and stress your server when you don’t need to?

  • Web Designers Rejoice: There is Still Room

    I’m taking a brief detour and talking about something other than user tolerance and action on your site. I read a couple of articles, which you’ve probably seen yourself, and felt a deep need to say something. Smashing Magazine published Does The Future Of The Internet Have Room For Web Designers? and the rebuttal, I Want To Be A Web Designer When I Grow Up, but something was missing.

  • Anticipating User Action

Congrats, you’ve made it to the third part of my math-type exploration of anticipated user behavior on the web. Just a refresher, the last couple of posts were about user tolerance and anticipating falloff/satisficing. These posts may have been a little dense and really math-heavy, but it’s been worth it, right?

  • Anticipating User Falloff

    As we discussed last week, users have a predictable tolerance for wait times through waiting for page loading and information seeking behaviors. The value you get when you calculate expected user tolerance can be useful by itself, but it would be better if you could actually predict the rough numbers of users who will fall off early and late in the wait/seek process.

  • User Frustration Tolerance on the Web

I have been working for quite a while to devise a method for assessing web sites on their ability to provide two things. First, I want to assess the ability of a user to perform an action they want to perform. Second, I want to assess the ability of the user to complete a business goal while completing their own goals.

  • Google Geocoding with CakePHP

Google has some pretty neat toys for developers, and CakePHP is a pretty friendly, well-supported framework for quickly building applications. That said, when I went looking for a Google geocoding component, I was a little surprised to discover that nobody had created one to do the hand-shakey business between a CakePHP application and Google.

  • Small Inconveniences Matter

    Last night I was working on integrating oAuth consumers into Noisophile. This is the first time I had done something like this so I was reading all of the material I could to get the best idea for what I was about to do. I came across a blog post about oAuth and one particular way of managing the information passed back from Twitter and the like.

  • Know Thy Customer

    I’ve been tasked with an interesting problem: encourage the Creative department to migrate away from their current project tracking tool and into Jira. For those of you unfamiliar with Jira, it is a bug tracking tool with a bunch of toys and goodies built in to help keep track of everything from hours to subversion check-in number. From a developer’s point of view, there are more neat things than you could shake a stick at. From an outsider’s perspective, it is a big, complicated and confusing system with more secrets and challenges than one could ever imagine.

  • When SEO Goes Bad

    My last post was about finding a healthy balance between client- and server-side technology. My friend sent me a link to an article about SEO and Google’s “reasonable surfer” patent. Though the information regarding Google’s methods for identifying and appropriately assessing useful links on a site was interesting, I am quite concerned about what the SEO crowd was encouraging because of this new revelation.

  • Balance is Everything

    Earlier this year I discussed progressive enhancement, and proposed that a web site should perform the core functions without any frills. Last night I had a discussion with a friend, regarding this very same topic. It came to light that it wasn’t clear where the boundaries should be drawn. Interaction needs to be a blend of server- and client-side technologies.

  • Coding Transparency: Development from Design Comps

    Since I am an engineer first and a designer second in my job, more often than not the designs you see came from someone else’s comp. Being that I am a designer second, it means that I know just enough about design to be dangerous but not enough to be really effective over the long run.

  • Usabilibloat or Websites Gone Wild

It’s always great when you have the opportunity to build a site from the ground up. You have opportunities to design things right the first time, and set standards in place for future users, designers and developers alike. These are the good times.

  • Thinking in Pieces: Modularity and Problem Solving

I am big on modularity. There are lots of problems on the web to fix and modularity applies to many of them. A couple of posts ago I talked about content and that it is all built on or made of objects. The benefits of working with objectified content are the ease of updating and the breadth and depth of content that can be added to the site.

  • Almost Pretty: URL Rewriting and Guessability

    Through all of the usability, navigation, design, various user-related laws and a healthy handful of information and hierarchical tricks and skills, something that continues to elude designers and developers is pretty URLs. Mind you, SEO experts would balk at the idea that companies don’t think about using pretty URLs in order to drive search engine placement. There is something else to consider in the meanwhile:

  • Content: It's All About Objects

    When I wrote my first post about object-oriented content, I was thinking in a rather small scope. I said to myself, “I need content I can place where I need it, but I can edit once and update everything at the same time.” The answer seemed painfully clear: I need objects.

  • It's a Fidelity Thing: Stakeholders and Wireframes

    This morning I read a post about wireframes and when they are appropriate. Though I agree, audience is important, it is equally important to hand the correct items to the audience at the right times. This doesn’t mean you shouldn’t create wireframes.

  • Developing for Delivery: Separating UI from Business

    With the advent of Ruby on Rails (RoR or Rails) as well as many of the PHP frameworks available, MVC has become a regular buzzword. Everyone claims they work in an MVC fashion though, much like Agile development, it comes in various flavors and strengths.

  • I Didn't Expect THAT to Happen

    How many times have you been on a website and said those very words? You click on a menu item, expecting to have content appear in much the same way everything else did. Then, BANG you get fifteen new browser windows and a host of chirping, talking and other disastrous actions.

  • Degrading Behavior: Graceful Integration

    There has been a lot of talk about graceful degradation. In the end it can become a lot of lip service. Often people talk a good talk, but when the site hits the web, let’s just say it isn’t too pretty.

  • Website Overhaul 12-Step Program

    Suppose you’ve been tasked with overhauling your company website. This has been the source of dread and panic for creative and engineering teams the world over.

  • Pretend that they're Users

    Working closely with the Creative team, as I do, I have the unique opportunity to consider user experience through the life of the project. More than many engineers, I work directly with the user. Developing wireframes, considering information architecture and user experience development all fall within my purview.

  • User Experience Means Everyone

    I’ve been working on a project for an internal client, which includes linking out to various medical search utilities. One of the sites we are using as a search provider offers pharmacy searches. The site was built on ASP.Net technology, or so I would assume as all the file extensions are ‘aspx.’ I bring this provider up because I was shocked and appalled by their disregard for the users that would be searching.

  • Predictive User Self-Selection

    Some sites, like this one, have a reasonably focused audience. It can become problematic, however, for corporate sites to sort out their users, and lead them to the path of enlightenment. In the worst situations, it may be a little like throwing stones into the dark, hoping to hit a matchstick. In the best, users will wander in and tell you precisely who they are.

  • Mapping the Course: XML Sitemaps

    I just read a short, relatively old blog post by David Naylor regarding why he believes XML sitemaps are bad. People involved with SEO probably know and recognize the name. I know I did. I have to disagree with his premise, but agree with his argument.

  • The Browser Clipping Point

    Today, at the time of this writing, Google posted a blog stating they were dropping support for old browsers. They stated:

  • Creativity Kills

    People are creative. It’s a fact of the state of humanity. People want to make things. It’s built into the human condition. But there is a difference between haphazard creation and focused, goal-oriented development.

  • Reactionary Navigation: The Sins of the Broad and Shallow

When given the task of making search terms and frequently visited pages more accessible to users, the uninitiated fire and fall back. They leave in their wake broad, shallow sites with menus and navigation which look more like weeds than an organized system. Ultimately, these navigation schemes fail to do the one thing they were intended for: enhance findability.

  • OOC: Object Oriented Content

    Most content on the web is managed at the page level. Though I cannot say that all systems behave in one specific way, I do know that each system I’ve used behaves precisely like this. Content management systems assume that every new piece of content which is created is going to, ultimately, have a page that is dedicated to that piece of content. Ultimately all content is going to live autonomously on a page. Content, much like web pages, is not an island.

  • Party in the Front, Business in the Back

    Nothing like a nod to the reverse mullet to start a post out right. As I started making notes on a post about findability, something occurred to me. Though it should seem obvious, truly separating presentation from business logic is key in ensuring usability and ease of maintenance. Several benefits can be gained with the separation of business and presentation logic including wiring for a strong site architecture, solid, clear HTML with minimal outside code interfering and the ability to integrate a smart, smooth user experience without concern of breaking the business logic that drives it.

  • The Selection Correction

    User self selection is a mess. Let’s get that out in the open first and foremost. As soon as you ask the user questions about themselves directly, your plan has failed. User self selection, at best, is a mess of splash pages and strange buttons. The web has become a smarter place where designers and developers should be able to glean the information they need about the user without asking the user directly.

  • Ah, Simplicity

    Every time I wander the web I seem to find it more complicated than the last time I left it.  Considering this happens on a daily basis, the complexity appears to be growing monotonically.  It has been shown again and again that the attention span of people on the web is extremely short.  A good example of this is a post on Reputation Defender about the click-through rate on their search results.

  • It's Called SEO and You Should Try Some

It’s been a while since I last posted, but this bears note. Search engine optimization, commonly called SEO, is all about getting search engines to notice you and people to come to your site. The important thing about good SEO is that it does more than simply get eyes on your site; it gets the RIGHT eyes on your site. People typically misunderstand the value of optimizing their site, or they fear it will radically alter the layout, message or other core elements they hold dear.

  • Information and the state of the web

    I only post here occasionally and it has crossed my mind that I might almost be wise to just create a separate blog on my web server.  I have these thoughts and then I realize that I don’t have time to muck with that when I have good blog content to post, or perhaps it is simply laziness.  Either way, I only post when something strikes me.

  • Browser Wars

    It’s been a while since I have posted. I know. For those of you that are checking out this blog for the first time, welcome. For those of you who have read my posts before, welcome back. We’re not here to talk about the regularity (or lack thereof) that I post with. What we are here to talk about is supporting or not supporting browsers. So first, what inspired me to write this? Well… this:

  • Web Scripting and you

    If there is one thing that I feel can be best learned from programming for the internet it’s modularity.  Programmers preach modularity through encapsulation and design models but ultimately sometimes it’s really easy to just throw in a hacky fix and be done with the whole mess.  Welcome to the “I need this fix last week” school of code updating.  Honestly, that kind of thing happens to the best of us.

  • Occam's Razor

    I have a particular project that I work on every so often. It’s actually kind of a meta-project as I have to maintain a web-based project queue and management system, so it is a project for the sake of projects. Spiffy eh? Anyway, I haven’t had this thing break in a while which either means that I did such a nice, robust job of coding the darn thing that it is unbreakable (sure it is) or more likely, nobody has pushed this thing to the breaking point. Given enough time and enough monkeys. All of that aside, every so often, my boss comes up with new things that she would like the system to do, and I have to build them in. Fortunately, I built it in such a way that most everything just kind of “plugs in” not so much that I have an API and whatnot, but rather, I can simply build out a module and then just run an include and use it. Neat, isn’t it?

  • Inflexible XML data structures

    Happy new year! Going into the start of the new year, I have a project that has carried over from the moment I started my current job. I am working on the information architecture and interaction design of a web-based insurance tool. Something that I have run into recently is a document structure that was developed using XML containers. This, in and of itself, is not an issue. XML is a wonderful tool for dividing information up in a useful way. The problem lies in how the system is implemented. This, my friends, is where I ran into trouble with a particular detail in this project. Call it the proverbial bump in the road.

  • Accessibility and graceful degradation

Something that I have learnt over time is how to make your site accessible for people that don’t have your perfect 20/20 vision, are working from a limited environment or just generally have old browsing capabilities. Believe it or not, people that visit my web sites still use old computers with old copies of Windows. Personally, I have made the Linux switch everywhere I can. That being said, I spend a certain amount of time surfing the web using Lynx. This is not due to the fact that I don’t have a GUI in Linux. I do. And I use Firefox for my usual needs, but Lynx has a certain special place in my heart. It is in a class of browser that sees the web in much the same way that a screen reader does. For example, all of those really neat iframes that you use for dynamic content? Yeah, those come up as “iframe.” Totally unreadable. Totally unreachable. Iframe is an example of web technology that is web-inaccessible. Translate this as bad news.

  • Less is less, more is more. You do the math.

    By this I don’t mean that you should fill every pixel on the screen with text, information and blinking, distracting graphics. What I really mean is that you should give yourself more time to accomplish what you are looking to do on the web. Sure, your reaction to this is going to be “duh, of course you should spend time thinking about what you are going to do online. All good jobs take time.” I say, oh young one, are you actually spending time where it needs to be spent? I suspect you aren’t.

  • Note to self, scope is important.

    Being that this was an issue just last evening, I thought I would share something that I have encountered when writing Javascript scripts.  First of all, let me state that Javascript syntax is extremely forgiving.  You can do all kinds of  unorthodox declarations of variables as well as use variables in all kinds of strange ways.  You can take a variable, store a string in it, then a number, then an object and then back again.  Weakly typed would be the gaming phrase.  The one thing that I would like to note, as it was my big issue last evening, is scope of your variables.  So long as you are careful about defining the scope of any given variable then you are ok, if not, you could have a problem just like I did.  So, let’s start with scope and how it works.
