Reactionary Navigation: The Sins of the Broad and Shallow

Jan 18, 2010

When given the task of making search terms and frequently visited pages more accessible to users, the uninitiated fire and fall back. They leave in their wake broad, shallow sites with menus and navigation that look more like weeds than an organized system. Ultimately, these navigation schemes fail to do the one thing they were intended to do: enhance findability.

Though one of my latest projects was the final straw, prompting this post, I’ve seen teams approach sites with the goal of findability and navigability in mind, only to end up with a system of menus and a field of links that are almost impenetrable to even the most tenacious of webonauts. Documents, pages and external links mingle in a taxonomic and architectural nightmare.

Perceived site architecture is to blame for this iniquity, regardless of the real information hierarchy. Although broad and shallow architecture is fine for small, simple sites, it is unforgiving as a site grows and the number of pages needed to contain all of its information balloons.

Broad and shallow architecture is precisely what it sounds like. Instead of crafting a set of taxonomical structures and an architecture that reflects the hierarchy of information on the site, broad and shallow architecture offers all pages at the same level and provides no understanding of the interrelation between pages and the information they contain.

When search and analytics data is read without proper insight, unraveling the intent of the users visiting a site quickly becomes confusing. Users search for strange items and land on pages that may not reflect their original intent. Often, frustration mounts and they will search for anything that seems related to what they want. Ultimately, the user will become discouraged and leave the site angry and unfulfilled. Angry users are never return customers.

These frantic searches can lead to unexpected search terms. Someone who is unskilled in assessing user data is likely to assume that pages need to be accessible directly from the home page of a site. Eventually buildup occurs and every page becomes a direct link from the home page. When this happens, a broad and shallow architecture emerges from the mire. With Draconian enforcement, teams will inflict “usability” upon the user in heaps and gobs.

The only way I know to solve this kind of problem is to strip a site down to nothing and begin again. It’s a hard pill to swallow and many teams respond horribly to this kind of news. I typically revel in this kind of situation because it gives the site new hope for a fresh beginning.

The best thing any team can do is uncover a clear hierarchy and stick to their guns. Often, it’s not the information hierarchy but the navigation architecture and highlighted links that kill the user experience faster than anything else. Understanding information importance and subordination will always provide a solid foundation on which to build a site with longevity and scalability.

After a solid, clear hierarchy has been laid in place, select clear, descriptive language to describe the categorization. Be certain you are using the user’s parlance. Review search terms, both internal and search-engine-referrer sourced, to select the right verbiage. These carefully selected key terms will be fundamental in guiding your users in a comfortable, transparent manner.

Find key terms which are most searched for and focus on guiding your users there. Most often the users that are searching on your site are not finding what they are looking for. Even if the pages are clearly defined in the information hierarchy, the path to arrive there may not be so clear. Provide road signs for the user to follow.

Signs on your site should be sparse, much like signs found describing the roads on which you drive. Don’t inundate your user with directions to get everywhere. They have a goal. Find out what it is and lead them to the promised land. Guide them gently and let their discovery feel like their success.

Ultimately, large sites should avoid the broad and shallow approach, opting for a narrow and deep approach instead. Give signs along the way to ensure the user experiences incremental success. Guide your user gently and let their success be a rewarding experience they will look forward to recreating when they return next.

OOC: Object Oriented Content

Jan 15, 2010

Most content on the web is managed at the page level. Though I cannot say that all systems behave in one specific way, I do know that each system I’ve used behaves precisely like this. Content management systems assume that every new piece of content which is created is going to, ultimately, have a page that is dedicated to that piece of content. Ultimately all content is going to live autonomously on a page. Content, much like web pages, is not an island.

Six months or a year ago, I had an epiphany. Content can be handled much like programming, i.e. in an object-oriented manner. Web sites often have repeating elements which could be broken out into individual pieces and reused throughout the site. These pieces could be considered objects in their own right, and they would share quite a bit with objects in programming. After building a proprietary Content Management System around this concept, I coined the phrase “Object Oriented Content.”

Object Oriented Programming (OOP) has roots dating back as far as the 1960s and came into common use in the early 1990s, according to Wikipedia. Since its widespread adoption, OOP has become commonplace among engineers and is expected to be part of a programmer’s standard arsenal. Though object orientation (OO) has become commonplace with engineering professionals, some of the inherent benefits of OO have been overlooked in other, non-geek circles, especially within creative groups.

Though content will never demonstrate all of the principles found in programming, as it is written copy and not a programming language, there are some striking similarities between OOC and OOP. Principles such as encapsulation, inheritance and abstraction come to light as content is broken into objects and removed from the page in which it will be displayed.

Content, once broken into objects, is an abstraction from the page on which it was intended to be displayed. We can look at it this way: distinct copy is an instance of a content object, and content is what goes on a page to answer the content question on the web. In plain English, this means any copy I create is, ultimately, content. A content object is an abstract idea that is used to answer the question of content on the web.

This abstraction of content from the page provides great power in managing a website and trimming time off the process of maintaining a website. Content objects can be reused again and again throughout your site. Moreover, since the content is not tied directly to a particular page, it offers greater flexibility in the operation of gathering and presenting content on the web. The power comes from the ability to edit content in a single place in the system and update across an entire site, or across multiple sites.
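The "edit once, update everywhere" idea can be sketched in a few lines of code. This is an illustrative model, not the proprietary system described above: pages hold references to shared content objects, so a single edit to the object is reflected on every page that displays it.

```python
# Minimal sketch of content objects shared by reference across pages.
# Class and field names here are illustrative assumptions.

class ContentObject:
    def __init__(self, copy):
        self.copy = copy  # the actual written content

class Page:
    def __init__(self, name, objects):
        self.name = name
        self.objects = objects  # references to shared objects, not copies

    def render(self):
        return "\n".join(obj.copy for obj in self.objects)

# The same object placed on two different pages...
promo = ContentObject("Spring sale: 20% off!")
home = Page("home", [promo])
products = Page("products", [promo])

# ...means one edit updates both pages at once.
promo.copy = "Summer sale: 30% off!"
assert "Summer" in home.render()
assert "Summer" in products.render()
```

The key design choice is that pages never own their content; they merely point at it, which is what makes the site-wide update a single operation.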

A content object is, by its very nature, encapsulated. All parts of the content are maintained within the content object and no part of the content lives outside of the object. The site page on which your content is to be displayed is totally unaware of what the object contains; it knows only that it has received a chunk ready for human consumption.

For the non-programmer, this means your end display does not look for what kind of content is being displayed or what the content says specifically. Your client-facing site simply receives a prepared object and displays it as it is, without meddling in the affairs of the copy writer and editor.

Ultimately, display properties may vary based on things like CSS and container wrappers in your site, but the content itself will remain wholly intact and unedited. This translates to high fidelity in content presentation, true to what the author intended.

Finally, we will look at content inheritance. Content inheritance must work differently from programming inheritance. Engineers will argue that what I am offering here is not the same effect, but for copy writers and editors the world over, this will be a great benefit.

If you create a content instance and store copy in it, you can reuse it. This we know. Within my proprietary system, you can also inherit content from an existing instance. Once content is stored, it can be included in a page. Suppose you would like to make a change to your content, but only on one page. You can clone the content and edit it accordingly. What you’ve done is inherited from the original content, but modified it to suit the new use.
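This clone-then-edit style of inheritance can be sketched as follows. The class and method names are illustrative assumptions, not the API of the proprietary system mentioned above; the point is that a clone starts from the parent's copy and then evolves independently.

```python
import copy

# Illustrative sketch of clone-based content inheritance.
class ContentObject:
    def __init__(self, copy_text):
        self.copy = copy_text

    def clone(self):
        # "Inheritance" here means copy-on-clone: the child begins
        # as a duplicate of the parent, then diverges freely.
        return copy.deepcopy(self)

original = ContentObject("Welcome to our store.")
variant = original.clone()
variant.copy = "Welcome to our outlet store."  # page-specific edit

# Editing the original afterwards does not ripple into the clone,
# which is exactly the behavior an editor expects.
original.copy = "Welcome to our flagship store."
assert variant.copy == "Welcome to our outlet store."
```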

There is an issue with this definition of inheritance. If you modify the original content, your modified content does not change with it. This is, however, a boon to your editor as they expect content to behave this way. If your modifications change, or revert to some other form, it would lead to a great deal of frustration.

In the end, moving away from a static page-content model and to a more flexible and fluid content object model provides a great deal of power and ease when preparing a site for production. From the creation of content, which can be reused, to editing content where a single change can affect multiple pages, the process of updating content prepared for the web becomes fast and easy, allowing all parties to spend less time managing pages and more time doing what they specialize in: providing content to the user. Consider applying this approach to your site and make the web a better place.

Party in the Front, Business in the Back

Jan 14, 2010

Nothing like a nod to the reverse mullet to start a post out right. As I started making notes on a post about findability, something occurred to me. Though it should seem obvious, truly separating presentation from business logic is key to ensuring usability and ease of maintenance. Several benefits come from separating business and presentation logic: a strong site architecture, solid, clear HTML with minimal outside code interfering, and the ability to integrate a smart, smooth user experience without fear of breaking the business logic that drives it.

The benefit that engineers will appreciate is the ease of maintainability. With business logic abstracted from the presentation, engineers can maintain the infrastructure without worrying about breaking the look and feel of the client experience. This alleviates stress and the sleepless nights they might otherwise experience.

I just finished building the beta version of a Content Management System for the company I am with. I built a multi-layer system including two distinct uses of the Model-View-Controller design pattern. The first instance was the underlying system that actually maintains the content, page information, taxonomy, hierarchy and page templates. The second was more abstracted from the classic MVC pattern.

The CMS control interface, in my abstract version, is the model. The API is the controller: it handles requests for display data and stores incoming input. Finally, a very lightweight templating system is the view. The templating system is actually built to act as a fully functional site on its own, built around a business-logic/presentation split. The entire site can run on a separate machine from the CMS or API.
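The shape of that decoupling can be sketched in a few lines. This is a hypothetical stand-in, not the actual CMS: the view layer only knows how to ask the API for prepared content and drop it into a template, never inspecting or editing what it receives.

```python
# Illustrative sketch of the decoupled view/API split described above.
# Function names and the content store are assumptions for the example.

def api_get_content(slug):
    # Stand-in for an HTTP call to the CMS API; in a real deployment
    # this could live on an entirely separate machine.
    store = {"about": "<p>We make the web a better place.</p>"}
    return store.get(slug, "<p>Not found.</p>")

def render_page(slug, template="<html><body>{body}</body></html>"):
    # The view never looks inside the content; it just places the
    # prepared chunk into its template.
    return template.format(body=api_get_content(slug))

html = render_page("about")
assert "better place" in html
```

Because the view's only dependency is the API call, swapping out the business logic behind that call leaves the template layer untouched.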

The benefit to all of this is releasing the presentation which the user interacts with from the confines of business logic which can slow a site down and muddy the response of an interactive system. Flexibility is the path to enlightenment.

With the presentation decoupled, I have a unique opportunity to concern myself only with what the user will interact with. With some creative CSS and good, solid HTML I can build an experience my users will look forward to. Also, with a lighter template to render, the site response should be snappier, leaving users free to concern themselves with what they came for: content.

Because much of the template is pure HTML, the task of search engine optimization becomes trivial. The HTML validates better and the content hierarchy is easier to develop. This means search engines will be more likely to get a clear picture of what the site is like and how the content interrelates.

The benefit of keeping the business logic tucked away behind an API is that, regardless of how it changes or functions, the user will have a seamless, predictable experience. This builds trust between the user and your organization, which increases the likelihood they will stay longer and visit again.

Finally, all of this positive user experience would not be possible without a solid design. Since the business logic does not interact or interfere with the design, the designer is empowered to make bold moves and guide the user through the site in a way that might not have been possible if user interface were tightly coupled with the code that drives it.

In the end, decoupling the business logic from the presentation allows us to move back to a simpler time when the web was primarily HTML, except you’ve got more than just bronze-age tools to build with. With increased usability, maintainability and findability, your site will feel more like a smooth, clean experience and less like a clunky tool straight out of the mid-nineties. Presentation decoupling makes for happy users and everyone wants happy users. Go make the web a better place.

The Selection Correction

Jan 11, 2010

User self selection is a mess. Let’s get that out in the open first and foremost. As soon as you ask the user questions about themselves directly, your plan has failed. User self selection, at best, is a mess of splash pages and strange buttons. The web has become a smarter place where designers and developers should be able to glean the information they need about the user without asking the user directly.

The innate problem with asking the user about what they want is, they will invariably give you the wrong answer. Sometimes it happens because they don’t know what they want. Sometimes they don’t care. Sometimes they misunderstand what you really want to know and sometimes they flat out lie to see what happens.

The question has hit my desk a few times now, “how does the user self-select in a nice, fluid manner?” The answer: they don’t.

It occurred to me, while pondering this question, that it wasn’t the answer that was a problem. It was the question. I don’t want to know how the user self-selects. I want to know how I am to select the user. The question is flawed.

The first thing I did, once I awoke from the haze and saw the truth for what it was, I reformulated the question. How do I select the user before they get to the site?

It sounds like I am preparing for a life of mind reading. No computer will tell you much, if anything, about your user. Do they like cats or do they like laser blasters? The computer neither knows nor cares. All you get is an IP address, an operating system and browser info.

So, how would one approach the user selection issue? Is it a design concern or a development concern? Yes.

The developer can say a lot about the computer the user is using, the place they came from to get to your site and where they landed on your site. The designer can pick up the stragglers and put them on the path to user experience redemption.

First the developer must work their magic. Given where the user came from and what page they landed on, the developer can make predictions about what they are going to want to do next. These predictions are essential to handling the user experience moving forward.

If you have more than one type of user coming to your site, it is helpful to understand what they did to arrive. If they typed in the address manually, or clicked on a link in their e-mail, the resulting behavior is almost certain. They are there for a purpose and it would be wise to get out of their way.

If, on the other hand, the user arrived via search, the search terms will probably be in the referrer URL that is passed along with the GET request. If this is too much tech speak, think of it this way: the browser tells you what they searched for and you can use that to guide your user.
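Extracting those search terms is a small exercise in URL parsing. A minimal sketch, assuming the common `q` parameter used by major engines (the exact parameter name varies by search provider):

```python
# Sketch of pulling search terms out of a referrer URL.
from urllib.parse import urlparse, parse_qs

def search_terms_from_referrer(referrer):
    query = parse_qs(urlparse(referrer).query)
    for key in ("q", "query", "p"):  # engine-specific parameter names
        if key in query:
            return query[key][0].split()
    return []

terms = search_terms_from_referrer(
    "http://www.google.com/search?q=apple+pie+recipe")
assert terms == ["apple", "pie", "recipe"]
```

With the terms in hand, the site can highlight matching sections or suggest related pages rather than leaving the user to hunt on their own.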

Any other in-bound links will tell you the user is interested in the page they clicked through to. This is especially true if they were not misled to believe the page is something it’s not. Your SEO skills would come in handy for that little task. If you know the user is interested in a particular product or area, you can use that for opportunities to cross- or up-sell.

Once the user is on your site, the battle is half over, or only half over, depending on your outlook. Since you know something about your user, you can guide them. If, on the other hand, the user arrives mistakenly on the wrong page, they need a way to get out of their mess.

This is where design and client-side architecture come into play. Users typically behave in a click, back, click, back manner. They click a link then, if it is not what they wanted, they click the back button. It is your job, O noble chess player, to stave off that back-click at the expense of life and limb.

Make it easy to find the way to the right place on the site. “Is this not what you wanted? Why not try this?” It’s a fantastic way to lead the user where you want them to go. Make their journey one that ends at Mecca.

There are two wonderful side-effects of pre-selecting your user and their journey. The first is you can streamline the architecture of your site to match precise needs and exclude the train wreck “features” that balloon into eventual site clutter. Secondly, you can spend more time solving the problem of how to handle the edge-case users, leaving the straight-and-narrow users alone to complete their journey as effortlessly as they please.

In the end, the question of how a user self-selects should undergo great scrutiny before it is passed off as a primary goal of the site development process. Think carefully and consider, not only the visual elements of the site, but the outside influences that make up the ecosystem you are about to interact with. Consider your user before building the site. Your users will thank you for it.

Ah, Simplicity

Jan 8, 2010

Every time I wander the web I seem to find it more complicated than the last time I left it. Considering this happens on a daily basis, the complexity appears to be growing monotonically. It has been shown again and again that the attention span of people on the web is extremely short. A good example of this is a post on Reputation Defender about the click-through rate on their search results.

I was discussing these two aspects of the web with the graphic designer at my work and we seemed to agree that all evidence points to the growing trends of complexity and short attention spans. Then we had something of a revelation. Perhaps there is a correlation. Is it possible that the ever increasing complexity of the web and the numerous sites which live there are encouraging the limited attention of users? Perhaps it’s the other way around and short attention spans affect choice to add extra elements to an already architecture-overburdened site.

Without any solid evidence or support, I have a tendency to lean toward the complexity of web sites as a contributing factor to ever shorter attention spans. It has also been shown that people who are multitasking perform each task less efficiently than if they had focused on a single task through completion.

It seems, with such a claim, that complexity in web design and architecture would inherently lead to poor focus and retention. Understandably, there are multiple factors that play into a site with an overburdened architecture. Not only is there the desire to keep users on the site, which drives a push to present more compelling content, but there can be executive pressure to maintain certain elements on a page which may not serve the user’s purpose.

One thing that seems to make for better selling products, time and time again, is taking something that already exists and making it easier to use. Apple is a great example of this. They didn’t invent the computer, the MP3 player, the cellular phone or any other technology they sell. What they did was create something that was more appealing to the user.

This may be the key to getting ahead on the web as well. Suppose we packed more features into a site, but did it in a smarter way. The web site will never be as intuitive as the cheeseburger or the nipple, but if we could eliminate some of the clutter in favor of a more progressive experience, user retention and attention might be on the rise again.

From a user-experience and developer perspective (as opposed to a graphic design perspective) I argue that the first thing we could benefit from is stripping away the nonsense. Suppose we assumed that a blog would actually do what it was intended to do and present information in an easy-to-digest fashion.

Think of it as eating a fresh apple rather than apple pie. When you eat the apple straight as it came from the tree, you can predict roughly what you will get. It will be crunchy, sweet and apple flavored. Now, suppose you are eating a piece of apple pie. It is much more challenging to anticipate precisely what you will get. There are spices and sugar and a crust that all get in the way of the same tasty apple you wanted to eat. Now sometimes apple pie is just right, but just as often, a single apple would offer a more consistently good experience.

Once you nail down the basic site, you have moved back to the more nutritious and significantly less complicated apple. From there, we can think of enhancements as a genetically modified apple, destined for better flavor and crunch than you could ever find in nature. It’s the hyper-apple.

Different sites will require different focusing and careful pruning, but I have seen very few sites that could do with more clutter and complexity. Conversely, I’ve seen plenty of sites that could stand for an architecture trim and shape.

I took this very approach to heart when preparing the presentation of this site. At first blush, there appear to be elements distinctly missing, for instance, a sidebar. Perhaps this was a poor choice, but I don’t think anyone will miss it. A simple majority of the web-using populace is search centric anyway. I have made sure to leave the search bar easily accessible, but I have eliminated the archives links which are so ubiquitous on most blogs today.

I could discuss all of the various features and plugins I used to create the experience you now see, but the details are beside the point. In the end, either the user experience is a good one or a bad one. So far, the stats for this site seem to reflect an experience that has kept users on the site for about five minutes at a time and they have visited 3 or 4 pages per visit. Perhaps these stats aren’t the best, but I’m not going to complain about a bounce rate of less than 50% at the moment.

The next time you are looking at your own site, perhaps you will think about what could use a little trimming and, together, we can make a simpler, more exciting and engaging experience for our users. In the end, all I ask is that you do your part and help make the web a better place.

  • Web Designers Rejoice: There is Still Room

    I’m taking a brief detour and talking about something other than user tolerance and action on your site. I read a couple of articles, which you’ve probably seen yourself, and felt a deep need to say something. Smashing Magazine published Does The Future Of The Internet Have Room For Web Designers? and the rebuttal, I Want To Be A Web Designer When I Grow Up, but something was missing.

  • Anticipating User Action

    Congrats, you’ve made it to the third part of my math-type exploration of anticipated user behavior on the web. Just a refresher, the last couple of posts were about user tolerance and anticipating falloff/satisficing These posts may have been a little dense and really math-heavy, but it’s been worth it, right?

  • Anticipating User Falloff

    As we discussed last week, users have a predictable tolerance for wait times through waiting for page loading and information seeking behaviors. The value you get when you calculate expected user tolerance can be useful by itself, but it would be better if you could actually predict the rough numbers of users who will fall off early and late in the wait/seek process.

  • User Frustration Tolerance on the Web

    I have been working for quite a while to devise a method for assessing web sites and the ability to provide two things. First, I want to assess the ability for a user to perform an action they want to perform. Second I want to assess the ability for the user to complete a business goal while completing their own goals.

  • Google Geocoding with CakePHP

    Google has some pretty neat toys for developers and CakePHP is a pretty friendly framework to quickly build applications on which is well supported. That said, when I went looking for a Google geocoding component, I was a little surprised to discover that nobody had created one to do the hand-shakey business between a CakePHP application and Google.

  • Small Inconveniences Matter

    Last night I was working on integrating oAuth consumers into Noisophile. This is the first time I had done something like this so I was reading all of the material I could to get the best idea for what I was about to do. I came across a blog post about oAuth and one particular way of managing the information passed back from Twitter and the like.

  • Know Thy Customer

    I’ve been tasked with an interesting problem: encourage the Creative department to migrate away from their current project tracking tool and into Jira. For those of you unfamiliar with Jira, it is a bug tracking tool with a bunch of toys and goodies built in to help keep track of everything from hours to subversion check-in number. From a developer’s point of view, there are more neat things than you could shake a stick at. From an outsider’s perspective, it is a big, complicated and confusing system with more secrets and challenges than one could ever imagine.

  • When SEO Goes Bad

    My last post was about finding a healthy balance between client- and server-side technology. My friend sent me a link to an article about SEO and Google’s “reasonable surfer” patent. Though the information regarding Google’s methods for identifying and appropriately assessing useful links on a site was interesting, I am quite concerned about what the SEO crowd was encouraging because of this new revelation.

  • Balance is Everything

    Earlier this year I discussed progressive enhancement, and proposed that a web site should perform the core functions without any frills. Last night I had a discussion with a friend, regarding this very same topic. It came to light that it wasn’t clear where the boundaries should be drawn. Interaction needs to be a blend of server- and client-side technologies.

  • Coding Transparency: Development from Design Comps

    Since I am an engineer first and a designer second in my job, more often than not the designs you see came from someone else’s comp. Being that I am a designer second, it means that I know just enough about design to be dangerous but not enough to be really effective over the long run.

  • Usabilibloat or Websites Gone Wild

    It’s always great when you have the opportunity to built a site from the ground up. You have opportunities to design things right the first time, and set standards in place for future users, designers and developers alike. These are the good times.

  • Thinking in Pieces: Modularity and Problem Solving

    I am big on modularity. There are lots of problems on the web to fix and modularity applies to many of them. A couple of posts ago I talked about content and that it is all built on or made of objects. The benefits from working with objectified content is the ease of updating and the breadth and depth of content that can be added to the site.

  • Almost Pretty: URL Rewriting and Guessability

    Through all of the usability, navigation, design, various user-related laws and a healthy handful of information and hierarchical tricks and skills, something that continues to elude designers and developers is pretty URLs. Mind you, SEO experts would balk at the idea that companies don’t think about using pretty URLs in order to drive search engine placement. There is something else to consider in the meanwhile:

  • Content: It's All About Objects

    When I wrote my first post about object-oriented content, I was thinking in a rather small scope. I said to myself, “I need content I can place where I need it, but I can edit once and update everything at the same time.” The answer seemed painfully clear: I need objects.

  • It's a Fidelity Thing: Stakeholders and Wireframes

    This morning I read a post about wireframes and when they are appropriate. Though I agree, audience is important, it is equally important to hand the correct items to the audience at the right times. This doesn’t mean you shouldn’t create wireframes.

  • Developing for Delivery: Separating UI from Business

    With the advent of Ruby on Rails (RoR or Rails) as well as many of the PHP frameworks available, MVC has become a regular buzzword. Everyone claims they work in an MVC fashion though, much like Agile development, it comes in various flavors and strengths.

  • I Didn't Expect THAT to Happen

    How many times have you been on a website and said those very words? You click on a menu item, expecting to have content appear in much the same way everything else did. Then, BANG you get fifteen new browser windows and a host of chirping, talking and other disastrous actions.

  • Degrading Behavior: Graceful Integration

    There has been a lot of talk about graceful degradation. In the end it can become a lot of lip service. Often people talk a good talk, but when the site hits the web, let’s just say it isn’t too pretty.

  • Website Overhaul 12-Step Program

    Suppose you’ve been tasked with overhauling your company website. This has been the source of dread and panic for creative and engineering teams the world over.

  • Pretend that they're Users

    Working closely with the Creative team, as I do, I have the unique opportunity to consider user experience through the life of the project. More than many engineers, I work directly with the user. Developing wireframes, considering information architecture and user experience development all fall within my purview.

  • User Experience Means Everyone

    I’ve been working on a project for an internal client, which includes linking out to various medical search utilities. One of the sites we are using as a search provider offers pharmacy searches. The site was built on ASP.Net technology, or so I would assume as all the file extensions are ‘aspx.’ I bring this provider up because I was shocked and appalled by their disregard for the users that would be searching.

  • Predictive User Self-Selection

    Some sites, like this one, have a reasonably focused audience. It can become problematic, however, for corporate sites to sort out their users, and lead them to the path of enlightenment. In the worst situations, it may be a little like throwing stones into the dark, hoping to hit a matchstick. In the best, users will wander in and tell you precisely who they are.

  • Mapping the Course: XML Sitemaps

    I just read a short, relatively old blog post by David Naylor regarding why he believes XML sitemaps are bad. People involved with SEO probably know and recognize the name. I know I did. I have to disagree with his premise, but agree with his argument.

  • The Browser Clipping Point

    Today, at the time of this writing, Google posted a blog stating they were dropping support for old browsers. They stated:

  • Creativity Kills

    People are creative. It’s a fact of the state of humanity. People want to make things. It’s built into the human condition. But there is a difference between haphazard creation and focused, goal-oriented development.

  • Reactionary Navigation: The Sins of the Broad and Shallow

    When given a task of making search terms and frequently visited pages more accessible to users, the uninitiated fire and fall back. They leave in their wake broad, shallow sites with menus and navigation which look more like weeds than an organized system. Ultimately, these navigation schemes fail to do the one thing they were intended for: enhancing findability.

  • OOC: Object Oriented Content

    Most content on the web is managed at the page level. Though I cannot say that all systems behave in one specific way, I do know that each system I’ve used behaves precisely like this. Content management systems assume that every new piece of content which is created is going to, ultimately, have a page that is dedicated to that piece of content. Ultimately all content is going to live autonomously on a page. Content, much like web pages, is not an island.

  • Party in the Front, Business in the Back

    Nothing like a nod to the reverse mullet to start a post out right. As I started making notes on a post about findability, something occurred to me. Though it should seem obvious, truly separating presentation from business logic is key in ensuring usability and ease of maintenance. Several benefits can be gained with the separation of business and presentation logic including wiring for a strong site architecture, solid, clear HTML with minimal outside code interfering and the ability to integrate a smart, smooth user experience without concern of breaking the business logic that drives it.

  • The Selection Correction

    User self-selection is a mess. Let’s get that out in the open first and foremost. As soon as you ask the user questions about themselves directly, your plan has failed. User self-selection, at best, is a mess of splash pages and strange buttons. The web has become a smarter place where designers and developers should be able to glean the information they need about the user without asking the user directly.

  • Ah, Simplicity

    Every time I wander the web I seem to find it more complicated than the last time I left it.  Considering this happens on a daily basis, the complexity appears to be growing monotonically.  It has been shown again and again that the attention span of people on the web is extremely short.  A good example of this is a post on Reputation Defender about the click-through rate on their search results.

  • It's Called SEO and You Should Try Some

    It’s been a while since I last posted, but this bears note. Search engine optimization, commonly called SEO, is all about getting search engines to notice you and people to come to your site. The important thing about good SEO is that it will do more than simply get eyes on your site; it will get the RIGHT eyes on your site. People typically misunderstand the value of optimizing their site or they think that it will radically alter the layout, message or other core elements they hold dear.

  • Information and the state of the web

    I only post here occasionally and it has crossed my mind that I might almost be wise to just create a separate blog on my web server.  I have these thoughts and then I realize that I don’t have time to muck with that when I have good blog content to post, or perhaps it is simply laziness.  Either way, I only post when something strikes me.

  • Browser Wars

    It’s been a while since I have posted. I know. For those of you that are checking out this blog for the first time, welcome. For those of you who have read my posts before, welcome back. We’re not here to talk about the regularity (or lack thereof) that I post with. What we are here to talk about is supporting or not supporting browsers. So first, what inspired me to write this? Well… this:

  • Web Scripting and you

    If there is one thing that I feel can be best learned from programming for the internet it’s modularity.  Programmers preach modularity through encapsulation and design models but ultimately sometimes it’s really easy to just throw in a hacky fix and be done with the whole mess.  Welcome to the “I need this fix last week” school of code updating.  Honestly, that kind of thing happens to the best of us.

  • Occam's Razor

    I have a particular project that I work on every so often. It’s actually kind of a meta-project as I have to maintain a web-based project queue and management system, so it is a project for the sake of projects. Spiffy eh? Anyway, I haven’t had this thing break in a while which either means that I did such a nice, robust job of coding the darn thing that it is unbreakable (sure it is) or more likely, nobody has pushed this thing to the breaking point. Given enough time and enough monkeys. All of that aside, every so often, my boss comes up with new things that she would like the system to do, and I have to build them in. Fortunately, I built it in such a way that most everything just kind of “plugs in” not so much that I have an API and whatnot, but rather, I can simply build out a module and then just run an include and use it. Neat, isn’t it?

  • Inflexible XML data structures

    Happy new year! Going into the start of the new year, I have a project that has carried over from the moment I started my current job. I am working on the information architecture and interaction design of a web-based insurance tool. Something that I have run into recently is a document structure that was developed using XML containers. This, in and of itself, is not an issue. XML is a wonderful tool for dividing information up in a useful way. The problem lies in how the system is implemented. This, my friends, is where I ran into trouble with a particular detail in this project. Call it the proverbial bump in the road.

  • Accessibility and graceful degradation

    Something that I have learnt over time is how to make your site accessible for people that don’t have your perfect 20/20 vision, are working from a limited environment or just generally have old browsing capabilities. Believe it or not, people that visit my web sites still use old computers with old copies of Windows. Personally, I have made the Linux switch everywhere I can. That being said, I spend a certain amount of time surfing the web using Lynx. This is not due to the fact that I don’t have a GUI in Linux. I do. And I use Firefox for my usual needs, but Lynx has a certain special place in my heart. It is in a class of browser that sees the web in much the same way that a screen reader does. For example, all of those really neat iframes that you use for dynamic content? Yeah, those come up as “iframe.” Totally unreadable. Totally unreachable. Iframe is an example of web technology that is web-inaccessible. Translate this as bad news.

  • Less is less, more is more. You do the math.

    By this I don’t mean that you should fill every pixel on the screen with text, information and blinking, distracting graphics. What I really mean is that you should give yourself more time to accomplish what you are looking to do on the web. Sure, your reaction to this is going to be “duh, of course you should spend time thinking about what you are going to do online. All good jobs take time.” I say, oh young one, are you actually spending time where it needs to be spent? I suspect you aren’t.

  • Note to self, scope is important.

    Being that this was an issue just last evening, I thought I would share something that I have encountered when writing JavaScript.  First of all, let me state that JavaScript syntax is extremely forgiving.  You can do all kinds of unorthodox declarations of variables as well as use variables in all kinds of strange ways.  You can take a variable, store a string in it, then a number, then an object and then back again.  Weakly typed would be the operative phrase.  The one thing that I would like to note, as it was my big issue last evening, is the scope of your variables.  So long as you are careful about defining the scope of any given variable then you are OK; if not, you could have a problem just like I did.  So, let’s start with scope and how it works.
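
    The scope pitfall that excerpt alludes to can be sketched in a few lines. This is a minimal illustration, not code from the post itself; the function names are made up for the example. Omitting `var` inside a function silently creates or overwrites a global variable instead of a local one:

    ```javascript
    var count = 10; // a global variable

    function brokenIncrement() {
      count = count + 1; // no `var`: this mutates the GLOBAL count
      return count;
    }

    function safeIncrement() {
      var count = 0;     // `var`: a new local that shadows the global
      count = count + 1;
      return count;      // always 1; the global is untouched
    }

    brokenIncrement();   // global count is now 11
    safeIncrement();     // returns 1, global count still 11
    ```

    The fix is simply to declare every variable with `var` in the narrowest scope that needs it, so an inner function can never clobber state it was not meant to own.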
