User Experience Means Everyone

Feb 5, 2010

I’ve been working on a project for an internal client, which includes linking out to various medical search utilities. One of the sites we are using as a search provider offers pharmacy searches. The site was built on ASP.Net technology, or so I would assume as all the file extensions are ‘aspx.’ I bring this provider up because I was shocked and appalled by their disregard for the users that would be searching.

This site, which shall remain unnamed, commits one of the greatest usability crimes I’ve seen: it relies solely on Javascript to submit its search. To give you, dear reader, the scope of the issue: I always test sites like these by disabling Javascript and trying the function again.

The search stopped working.

Mind you, if this were some sort of specialized search geared toward people working with Javascript technology, I might be able to see requiring Javascript to make the search work properly. Even in circumstances like that, though, shutting down the search when Javascript is disabled is still questionable.

This, however, is a search for local pharmacies.

Considering the users that might be searching for a pharmacy, we can compile a list. This is not comprehensive: the young, the elderly, the rich, the poor, sick people, healthy people, disabled people and blind people. I’ll stop there.

Let’s consider a couple of select groups in that list, i.e. the poor, the disabled and the blind. The less money you have, the less likely you are to buy a new computer if your old one still works. I know this sounds funny, but I’ve seen people using Internet Explorer 5.5 to access sites in the insurance world. Lord knows what other antiques they might use to access a site. Suffice it to say, older browsers may not support the AJAX calls made by an AJAX-only search.

Let’s now consider the two groups which are much larger than the IE 5.5 crowd: the disabled and the blind. I separate these two so we can think about different situations for each.

First, the blind. Blind people use screen readers to browse web sites. Though I am unsure of the latest capabilities of screen readers, the last time I read about them I came to understand that the experience is a little like using Lynx. See the screen capture below to get an idea of what Lynx is like.

[Screen capture: chrisstead.net on Lynx]

As you can see, browsing for the blind is kind of a no-frills venture. No CSS, no Javascript, no imagery. Since many of these users can’t see what you have made available (yes, there are varying degrees of blindness), they have to rely on a program to read the screen for them. This means pages that rely on Javascript for core functionality are out of reach for these users.
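A search like this needs none of that. As a minimal sketch (the file name search.php, the parameter zip and the markup are my own assumptions, not the provider’s actual code), a plain form submitted over GET works with Javascript disabled, in Lynx and in screen readers alike:

```
<?php
// Sketch: a pharmacy search that degrades gracefully. The form
// submits a plain GET request, so no Javascript is required;
// script can be layered on top later as an enhancement.
function renderSearchForm($zip = '') {
    $results = '';
    if ($zip !== '') {
        // A real implementation would query a store locator here.
        $results = '<p>Results near ' . htmlspecialchars($zip) . '</p>';
    }
    return $results
        . '<form action="search.php" method="get">'
        . '<label for="zip">ZIP code:</label> '
        . '<input type="text" name="zip" id="zip" /> '
        . '<input type="submit" value="Find a pharmacy" />'
        . '</form>';
}

echo renderSearchForm(isset($_GET['zip']) ? trim($_GET['zip']) : '');
```

Any Javascript enhancement can then hook the form’s submit event, and users without script simply fall through to the normal request.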

In much the same way, disabled users may have a limited set of functions they can access within their browser. This will depend on the degree of disability and the breadth of function on their browser. I can’t and won’t say what a disabled browsing experience is like since I am not disabled and the experience varies so widely it’s not possible to pin down what the overall experience is like. Suffice to say, it is limited.

Now, the reason I mentioned the site was built on ASP.Net: For whatever reason, the sites I see with the worst usability almost always seem to be built on ASP.Net. I have a hard time wrapping my head around this, as I’ve built ASP/C# apps and had no problem building the core functions to operate with or without Javascript enabled. Everything you need is right at your fingertips.

From sites that require users to be on a Windows machine running the newest version of Internet Explorer, to web apps that require users to have Javascript and images enabled just to navigate the core functions, ASP sites often seem to be the bottom of the barrel.

Perhaps it is a group of people that are used to developing for desktop apps and haven’t had to consider usability in the modern age of the web. Perhaps it’s novice developers that don’t understand some of the core concepts that go into building successful web applications. Either way, the current trend of inaccessible ASP sites must come to an end.

To the ASP.Net developers of the world, I implore you to reconsider your development goals and meet the needs of your customers. To the rest of you that may be committing the same sins in another language, I beg you to be considerate of all of your users, instead of a select group. Think about usability for a degraded experience, build accordingly and make the web a better place.

Predictive User Self-Selection

Feb 4, 2010

Some sites, like this one, have a reasonably focused audience. It can become problematic, however, for corporate sites to sort out their users, and lead them to the path of enlightenment. In the worst situations, it may be a little like throwing stones into the dark, hoping to hit a matchstick. In the best, users will wander in and tell you precisely who they are.

Fortunately, users often leave hints as to who they are without knowing it. They (hopefully) travel through your site, touching certain pages and avoiding others. They also arrive from somewhere.

When trying to select your user and direct them, your initial response may be to directly ask them who they are and what they want. This works well if you are an e-tailer like Amazon, but the rest of us don’t have quite the same luxury.

If you are planning on selecting the user once they get to your site, you are already too late and they have left. You should know something about them before they ever arrive. This, surprisingly, is not just about knowing your audience. It’s also about what they can tell you.

Yes, your users choose to tell you something before they arrive at your site.

To learn about your users sub rosa, you need look no further than the HTTP referer. From the HTTP referer, you can find out where your user came from and, hopefully, something about what they want.

If your user accessed your site directly, you know they are either a returning visitor or they were recommended by someone. If they arrived at a landing page, it is, undoubtedly, due to some marketing effort. This, however, says the least about the user.

If they came from another website and not a search engine, you’ll know they were reading another site related to yours. Perhaps this is a competitor. Perhaps this is a colleague. Either way, you know they have come to your site because they are interested in the link they clicked through to.

The most directly informative method a user can use to access your site is through search engines. You can gather, immediately, that your user was interested in something directly related to your site. You also know your user is actively seeking something. Finally, your user told the search engine what they wanted and, subsequently, they told you.

How?

The HTTP referer. The referer is passed with every GET request and tells the server about where you have been. Don’t get ahead of yourself. The referer only informs you of the last place your user was. In our current case, a search engine.

The big four search engines, i.e. Google, Yahoo, AOL and Bing (MSN), all use GET arguments to store information about the current search. This makes it easy to clip the information you want, like a coupon, from the HTTP referer string.

Note: The next bit of this discussion involves a little code. If you’re not comfortable with programming in the popular language PHP or this just isn’t your job, copy and paste the following into an e-mail and send it to your developer.

In PHP, you can collect user referer information with the following line of code:

```
$referer = $_SERVER["HTTP_REFERER"];
```
Once you have the referer stored, you can test it against the big four. One way of doing this might be like the following:

```
$refererType = "other";
$searchEngines = array("aol", "bing", "google", "yahoo");
foreach($searchEngines as $value){
     // preg_match() returns 1 on a match, 0 on no match
     if(preg_match("/$value/i", $referer) === 1){
          $refererType = $value;
          break;
     }
}
```
Now we've got the search engine they used and a referer string. This is prime time for extracting their search query and figuring out who your user really is. Hopefully this won't turn into a Scooby Doo episode, where Old Man Withers is haunting your site. Let's do some data extraction:

```
$queryKey = ($refererType == "yahoo") ? 'p' : 'q';
// Capture everything between "?q=" (or "&q=") and the next "&"
$pattern = "/[?&]" . $queryKey . "=([^&]*)/";
$arg = "";
if(preg_match($pattern, $referer, $matches) === 1){
     $arg = urldecode($matches[1]);
}
// urldecode() turns the '+' separators into spaces
$argArray = explode(' ', $arg);
```
Still with me? The code part is over. Now that you have the search information and it's been broken into happy, bite-sized chunks, you can use it to do all kinds of fun things with your user. You can check their spelling, to ensure they are in the right place. You can offer special links that relate to what they searched for. You can even adjust the page to better suit the user's needs. The possibilities are limitless. By knowing a little about what the user did before they arrived at your site, you can direct their journey through your site. It is in your hands to find interesting and creative uses for this information. Go and make the web a better place.
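For convenience, the steps above can be rolled into a single helper. The function name getSearchTerms is mine, and the substring matching is only a sketch:

```
<?php
// Sketch: extract search terms from an HTTP referer string.
// Returns an array of search words, or an empty array when the
// referer is not from one of the big four search engines.
function getSearchTerms($referer) {
    $searchEngines = array("aol", "bing", "google", "yahoo");
    $refererType = "other";
    foreach ($searchEngines as $engine) {
        if (preg_match("/$engine/i", $referer) === 1) {
            $refererType = $engine;
            break;
        }
    }
    if ($refererType === "other") {
        return array();
    }
    // Yahoo stores the query under 'p'; the others use 'q'.
    $queryKey = ($refererType === "yahoo") ? 'p' : 'q';
    if (preg_match("/[?&]" . $queryKey . "=([^&]*)/", $referer, $matches) !== 1) {
        return array();
    }
    // urldecode() turns the '+' separators into spaces
    return explode(' ', urldecode($matches[1]));
}
```

From there, a call like getSearchTerms($_SERVER["HTTP_REFERER"]) hands you the visitor's search words ready for spell checking, related links or page adjustments.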

Mapping the Course: XML Sitemaps

Feb 2, 2010

I just read a short, relatively old blog post by David Naylor regarding why he believes XML sitemaps are bad. People involved with SEO probably know and recognize the name. I know I did. I have to disagree with his premise, but agree with his argument.

I say XML sitemaps are good!

The real issue with XML sitemaps does not lie in the technology but in its use. If a site is well designed, well developed and has a strong information architecture, it should spider well and indexing should occur. Moreover, if the HTML/XHTML supporting the information on the site is well formed, the site should get decent rankings. This is where I agree with David.

Where I disagree is with the claim that there is nothing XML sitemaps do that other SEO best practices won’t do. There is one clear item on this docket: update frequency. There is no better tool I know of for announcing update frequency than an XML sitemap.

Within the standard for sitemap generation, update frequency can be denoted. By setting the update frequency appropriately, the spider will have an indicator of how often it should visit. This is really important in assuring that a spider will revisit your site and pick up new pages regularly, especially for new sites. Established sites may not suffer from the same crawl-frequency problems, but even there, it is good practice to make things as easy for the search spider as possible.

While we are on the topic of unique features, XML sitemaps also offer an opportunity to reinforce your navigation hierarchy. Page priority can be specified, giving the search engine an early indicator to what will be found in the site. Search spiders dislike hunting through site links to discern information for which they could, otherwise, have a pre-set baseline.
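To make this concrete, here is a minimal sketch of a sitemap carrying both update frequency and page priority, per the sitemaps.org protocol. The URLs are placeholders:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The home page announces daily changes and top priority, while the archive tells the spider it can visit far less often.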

It should be stated that XML sitemaps, when held isolated, do not cure all ills. They are merely one more tool in the locker that allows a site to grow and benefit from good search ranking. When sitemaps are coupled with strong content, good code, description tags, thoughtful information architecture and careful navigation, they can only be a boon to your site.

When used correctly, an XML sitemap will drive search spiders to key pages faster and ensure early indexing of the entire site. Once this initial indexing is managed, it is up to the people who maintain the site to ensure that the path is clear to access information across the site.

Ultimately, David does not argue against sitemaps but, instead, chooses an easy target like poor navigation and concludes that because there are users that don’t use a sitemap properly, the technology must be bad. This is disappointing as it seems to lead potential SEO professionals astray. XML sitemaps are your friends and when treated with the kindness and care a friend deserves, they will only be a boon to your site. Build your site well, use sitemaps intelligently and make the web a better place.

The Browser Clipping Point

Feb 1, 2010

Today, at the time of this writing, Google posted a blog stating they were dropping support for old browsers. They stated:

The web has evolved in the last ten years, from simple text pages to rich, interactive applications including video and voice. Unfortunately, very old browsers cannot run many of these new features effectively.

I made a case to move in the same direction at my company less than a month ago. I reviewed the visitor statistics and discovered less than 10% of all visitors to our sites use Internet Explorer 6. Months ago, Digg posted a blog asking whether they should block Internet Explorer 6 from viewing the site. Their statistics represented similar numbers to our own.

Blocking someone from viewing a site seems like a fairly aggressive move on Digg’s part. My proposal was much more relaxed and forgiving. I proposed that I upgrade Internet Explorer on my computer and stop supporting version 6. This doesn’t mean I plan to block people from the site if they haven’t upgraded; it just means I’ve consciously deprecated their choice.

A while back, I posted a blog about browser wars and how people were behaving on the web. I would never condone a conscious exclusion of one visitor or another simply to support my favorite browser. This is unfair and, moreover, can alienate the user in a way that will discourage people from ever returning to my site, even if they opted for my preferred browser.

I am certain someone is asking why 10% is a good threshold for clipping browser support. I assure you, the number is arbitrary. Some people may want to choose a higher or lower number, depending on what their audience needs. Regardless of the particular number, the important thing is the direction the percentage is headed.

When Firefox first hit the market, to say it wasn’t interesting as a browser because it didn’t have a large enough market share would have been perceived as foolish. Firefox use was on the rise, so catering to the users would have been in the best interest of all involved.

Internet Explorer 6 use is on the decline and the dropoff is getting steeper. As users buy new computers and upgrade their software, IE6 gets wiped out. Moreover, Microsoft started campaigning years ago for users to upgrade to a newer version of Internet Explorer.

Something of note: Internet Explorer 6 has been around for almost a decade now. As technology moves forward, IE6 only becomes more obsolete. One of the easiest examples to point to is the support for alpha-transparency PNGs. IE6 renders PNGs with alpha transparency against a solid blue background. Unless your site is already the particular shade of blue which is rendered, the transparent graphics are going to look kludgy and out of place on your site.
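For reference, the workaround most of us reached for at the time was Microsoft’s proprietary AlphaImageLoader filter, usually served only to IE6 through a conditional comment. A rough sketch, with a placeholder selector and image path:

```
/* IE6 only: swap the unsupported alpha PNG for the filter */
.logo {
    background-image: none;
    filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(
        src='images/logo.png', sizingMethod='crop');
}
```

It works, but it is exactly the kind of per-browser patchwork that makes supporting a nine-year-old browser so costly.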

Other items of note, which are important to developers more than users, are things like new Javascript technologies and updated CSS specifications. As these technologies improve and grow, IE6 will continue to seem more and more obsolete, much like IE5.5 did after IE6 hit the market.

To be fair, there are other old browsers which also fall down when pushed to render web sites using new technologies. The difference is, new browsers have built-in functions to test for updates. IE6 is old enough that Microsoft didn’t think to build that kind of functionality. They relied on users going to the Microsoft website and upgrading by hand.

In the end, we have reached a breaking point. Old browsers which are no longer supported, even by the company that built them, will eventually need to be clipped from the support regime that so many companies and individuals adhere to. Instead of blocking them, however, try the gentler approach of simply forgetting about them and letting them fade into the past. Be kind to your users, give them a gentle nudge to update and upgrade. Never push them off the cliff. Be aware of the browsers you support and make the web a better place.

Creativity Kills

Jan 29, 2010

People are creative. It’s a fact of the state of humanity. People want to make things. It’s built into the human condition. But there is a difference between haphazard creation and focused, goal-oriented development.

Andy Rutledge states that creativity is not design. I agree with him. Creativity alone does not solve problems. Creativity, when allowed free rein, is as much a destructive force in business projects as it could be a productive partner.

Creativity can be a great driver for new ideas, but when it remains the primary focus, the end product is bound to suffer. Web sites can prove a noteworthy breeding ground for creative direction overriding good problem solving.

I will avoid mentioning any sites I have found which are better macaroni-and-finger-paint projects than solutions to existing problems. I tend to agree with Steve Krug that it is hard to make a really effective site and easy to botch the job.

Andy Rutledge has already said quite enough about creativity versus visual design that I don’t feel I should elaborate any further. There are plenty of other aspects of a project that people get slick and tricky with.

Visual elements within a design can be a killer when you have someone who wants to spend lots of time being creative. Though the visual elements on a site can be referred to as artwork, they are not art like they have in your local coffee shop.

Before I started working at my current company, there was a designer that was interested in photography. When working on a particular marketing folder, he spent weeks creating a set, cutting out styrofoam letters and shapes. He set up backdrops and lights. Eventually he came away with just the right shot. Truth be told, it looked so perfect I swore he rendered it in a 3D imaging program.

The problem is he was more focused on being creative than on working toward business needs. The folder he created looked nice, but it was far too costly in time and salary. Creativity can be a major expense on a project for only a small improvement, if there is improvement at all.

Even if your designer stays sharp and focused, other issues can arise. Copywriters can get creative, which can be as detrimental to the message as any overwrought design. A good copywriter will stay focused on company goals, speak in simple language and cut straight to the goals of the business.

Bad copy can take a good design and tell your user that the company looks great on the outside, but suffers from a lack of direction on the inside. Creative copy can be painful. Often, a creative writer is going to show their love of the language, so they will use too much of it. Focus and simplicity are key.

The last problem I am going to bring up, though there are a large number of other issues that arise out of overly creative thinking, is creative development.

Development involves anything from site hierarchy and architecture to coding and various other elements which make the site recognizable as an interactive information machine.

When an information architect or user experience designer/developer allows creativity to get in the way of focusing on the user, the results can be disastrous. The site flow will suffer and the navigation will become conspicuous to the user.

Conspicuous site navigation and structure is painful. The user notices because they find themselves frustrated and lost. Lost, frustrated users leave, never to return.

Finally, the engineering development which goes into the site must be clear. Problems must be solved in a clean, thoughtful way, but creativity cannot drive this aspect of the project.

One of the most detrimental things to a project life cycle is a creative engineer. Engineers that are being creative first and solving problems second are engineers looking to add unnecessary features.

The cliche of the feature-happy engineer comes from a creative engineer. Good engineering requires a smart, creative problem solver. The key is solving a problem. If an engineer is allowed to create a solution that lacks a problem, the engineer is guaranteed to derail your project as fast as you can imagine.

In the end, business solutions require a clever, focused team. Creativity should be harnessed and directed toward solving existing problems. When creativity is allowed to run rampant in a business environment, the results can be damaging to the user experience and the business image. Go forth, solve problems and make the web a better place.

  • Web Designers Rejoice: There is Still Room

    I’m taking a brief detour and talking about something other than user tolerance and action on your site. I read a couple of articles, which you’ve probably seen yourself, and felt a deep need to say something. Smashing Magazine published Does The Future Of The Internet Have Room For Web Designers? and the rebuttal, I Want To Be A Web Designer When I Grow Up, but something was missing.

  • Anticipating User Action

    Congrats, you’ve made it to the third part of my math-type exploration of anticipated user behavior on the web. Just a refresher: the last couple of posts were about user tolerance and anticipating falloff/satisficing. These posts may have been a little dense and really math-heavy, but it’s been worth it, right?

  • Anticipating User Falloff

    As we discussed last week, users have a predictable tolerance for wait times through waiting for page loading and information seeking behaviors. The value you get when you calculate expected user tolerance can be useful by itself, but it would be better if you could actually predict the rough numbers of users who will fall off early and late in the wait/seek process.

  • User Frustration Tolerance on the Web

    I have been working for quite a while to devise a method for assessing web sites and their ability to provide two things. First, I want to assess the ability for a user to perform an action they want to perform. Second, I want to assess the ability for the user to complete a business goal while completing their own goals.

  • Google Geocoding with CakePHP

    Google has some pretty neat toys for developers, and CakePHP is a pretty friendly, well-supported framework for quickly building applications. That said, when I went looking for a Google geocoding component, I was a little surprised to discover that nobody had created one to do the hand-shakey business between a CakePHP application and Google.

  • Small Inconveniences Matter

    Last night I was working on integrating oAuth consumers into Noisophile. This is the first time I had done something like this so I was reading all of the material I could to get the best idea for what I was about to do. I came across a blog post about oAuth and one particular way of managing the information passed back from Twitter and the like.

  • Know Thy Customer

    I’ve been tasked with an interesting problem: encourage the Creative department to migrate away from their current project tracking tool and into Jira. For those of you unfamiliar with Jira, it is a bug tracking tool with a bunch of toys and goodies built in to help keep track of everything from hours to subversion check-in number. From a developer’s point of view, there are more neat things than you could shake a stick at. From an outsider’s perspective, it is a big, complicated and confusing system with more secrets and challenges than one could ever imagine.

  • When SEO Goes Bad

    My last post was about finding a healthy balance between client- and server-side technology. My friend sent me a link to an article about SEO and Google’s “reasonable surfer” patent. Though the information regarding Google’s methods for identifying and appropriately assessing useful links on a site was interesting, I am quite concerned about what the SEO crowd was encouraging because of this new revelation.

  • Balance is Everything

    Earlier this year I discussed progressive enhancement, and proposed that a web site should perform the core functions without any frills. Last night I had a discussion with a friend, regarding this very same topic. It came to light that it wasn’t clear where the boundaries should be drawn. Interaction needs to be a blend of server- and client-side technologies.

  • Coding Transparency: Development from Design Comps

    Since I am an engineer first and a designer second in my job, more often than not the designs you see came from someone else’s comp. Being that I am a designer second, it means that I know just enough about design to be dangerous but not enough to be really effective over the long run.

  • Usabilibloat or Websites Gone Wild

    It’s always great when you have the opportunity to build a site from the ground up. You have opportunities to design things right the first time, and set standards in place for future users, designers and developers alike. These are the good times.

  • Thinking in Pieces: Modularity and Problem Solving

    I am big on modularity. There are lots of problems on the web to fix and modularity applies to many of them. A couple of posts ago I talked about content and that it is all built on or made of objects. The benefits from working with objectified content is the ease of updating and the breadth and depth of content that can be added to the site.

  • Almost Pretty: URL Rewriting and Guessability

    Through all of the usability, navigation, design, various user-related laws and a healthy handful of information and hierarchical tricks and skills, something that continues to elude designers and developers is pretty URLs. Mind you, SEO experts would balk at the idea that companies don’t think about using pretty URLs in order to drive search engine placement. There is something else to consider in the meanwhile:

  • Content: It's All About Objects

    When I wrote my first post about object-oriented content, I was thinking in a rather small scope. I said to myself, “I need content I can place where I need it, but I can edit once and update everything at the same time.” The answer seemed painfully clear: I need objects.

  • It's a Fidelity Thing: Stakeholders and Wireframes

    This morning I read a post about wireframes and when they are appropriate. Though I agree, audience is important, it is equally important to hand the correct items to the audience at the right times. This doesn’t mean you shouldn’t create wireframes.

  • Developing for Delivery: Separating UI from Business

    With the advent of Ruby on Rails (RoR or Rails) as well as many of the PHP frameworks available, MVC has become a regular buzzword. Everyone claims they work in an MVC fashion though, much like Agile development, it comes in various flavors and strengths.

  • I Didn't Expect THAT to Happen

    How many times have you been on a website and said those very words? You click on a menu item, expecting to have content appear in much the same way everything else did. Then, BANG you get fifteen new browser windows and a host of chirping, talking and other disastrous actions.

  • Degrading Behavior: Graceful Integration

    There has been a lot of talk about graceful degradation. In the end it can become a lot of lip service. Often people talk a good talk, but when the site hits the web, let’s just say it isn’t too pretty.

  • Website Overhaul 12-Step Program

    Suppose you’ve been tasked with overhauling your company website. This has been the source of dread and panic for creative and engineering teams the world over.

  • Pretend that they're Users

    Working closely with the Creative team, as I do, I have the unique opportunity to consider user experience through the life of the project. More than many engineers, I work directly with the user. Developing wireframes, considering information architecture and user experience development all fall within my purview.

  • User Experience Means Everyone

    I’ve been working on a project for an internal client, which includes linking out to various medical search utilities. One of the sites we are using as a search provider offers pharmacy searches. The site was built on ASP.Net technology, or so I would assume as all the file extensions are ‘aspx.’ I bring this provider up because I was shocked and appalled by their disregard for the users that would be searching.

  • Predictive User Self-Selection

    Some sites, like this one, have a reasonably focused audience. It can become problematic, however, for corporate sites to sort out their users, and lead them to the path of enlightenment. In the worst situations, it may be a little like throwing stones into the dark, hoping to hit a matchstick. In the best, users will wander in and tell you precisely who they are.

  • Mapping the Course: XML Sitemaps

    I just read a short, relatively old blog post by David Naylor regarding why he believes XML sitemaps are bad. People involved with SEO probably know and recognize the name. I know I did. I have to disagree with his premise, but agree with his argument.

  • The Browser Clipping Point

    Today, at the time of this writing, Google posted a blog stating they were dropping support for old browsers. They stated:

  • Creativity Kills

    People are creative. It’s a fact of the state of humanity. People want to make things. It’s built into the human condition. But there is a difference between haphazard creation and focused, goal-oriented development.

  • Reactionary Navigation: The Sins of the Broad and Shallow

    When given the task of making search terms and frequently visited pages more accessible to users, the uninitiated fire and fall back. They leave in their wake broad, shallow sites with menus and navigation which look more like weeds than an organized system. Ultimately, these navigation schemes fail to do the one thing they were intended for: enhance findability.

  • OOC: Object Oriented Content

    Most content on the web is managed at the page level. Though I cannot say that all systems behave in one specific way, I do know that each system I’ve used behaves precisely like this. Content management systems assume that every new piece of content which is created is going to, ultimately, have a page that is dedicated to that piece of content. Ultimately all content is going to live autonomously on a page. Content, much like web pages, is not an island.

  • Party in the Front, Business in the Back

    Nothing like a nod to the reverse mullet to start a post out right. As I started making notes on a post about findability, something occurred to me. Though it should seem obvious, truly separating presentation from business logic is key in ensuring usability and ease of maintenance. Several benefits can be gained with the separation of business and presentation logic including wiring for a strong site architecture, solid, clear HTML with minimal outside code interfering and the ability to integrate a smart, smooth user experience without concern of breaking the business logic that drives it.

  • The Selection Correction

    User self selection is a mess. Let’s get that out in the open first and foremost. As soon as you ask the user questions about themselves directly, your plan has failed. User self selection, at best, is a mess of splash pages and strange buttons. The web has become a smarter place where designers and developers should be able to glean the information they need about the user without asking the user directly.

  • Ah, Simplicity

    Every time I wander the web I seem to find it more complicated than the last time I left it.  Considering this happens on a daily basis, the complexity appears to be growing monotonically.  It has been shown again and again that the attention span of people on the web is extremely short.  A good example of this is a post on Reputation Defender about the click-through rate on their search results.

  • It's Called SEO and You Should Try Some

    It’s been a while since I last posted, but this bears note. Search engine optimization, commonly called SEO, is all about getting search engines to notice you and people to come to your site. The important thing about good SEO is that it will do more than simply get eyes on your site; it will get the RIGHT eyes on your site. People typically misunderstand the value of optimizing their site, or they think that it will radically alter the layout, message or other core elements they hold dear.

  • Information and the state of the web

    I only post here occasionally, and it has crossed my mind that it might almost be wise to just create a separate blog on my web server. I have these thoughts and then I realize that I don’t have time to muck with that when I have good blog content to post, or perhaps it is simply laziness. Either way, I only post when something strikes me.

  • Browser Wars

    It’s been a while since I have posted. I know. For those of you that are checking out this blog for the first time, welcome. For those of you who have read my posts before, welcome back. We’re not here to talk about the regularity (or lack thereof) that I post with. What we are here to talk about is supporting or not supporting browsers. So first, what inspired me to write this? Well… this:

  • Web Scripting and you

    If there is one thing that I feel can be best learned from programming for the internet, it’s modularity. Programmers preach modularity through encapsulation and design models, but ultimately sometimes it’s really easy to just throw in a hacky fix and be done with the whole mess. Welcome to the “I need this fix last week” school of code updating. Honestly, that kind of thing happens to the best of us.
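    The encapsulation the excerpt alludes to can be sketched with the classic JavaScript module pattern (the counter is a made-up example): internal state lives in a closure, and only the methods you explicitly return are exposed.

```javascript
// A self-contained module: `count` is hidden in the closure,
// so nothing outside can reach in and poke at it directly.
var counter = (function () {
  var count = 0; // private state

  return {
    increment: function () { count += 1; return count; },
    value: function () { return count; }
  };
})();

counter.increment();
counter.increment();
console.log(counter.value());       // 2
console.log(typeof counter.count);  // "undefined" -- the internals stay private
```

    A hacky fix applied inside the module stays inside the module, which is exactly the point.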

  • Occam's Razor

    I have a particular project that I work on every so often. It’s actually kind of a meta-project, as I have to maintain a web-based project queue and management system, so it is a project for the sake of projects. Spiffy, eh? Anyway, I haven’t had this thing break in a while, which either means that I did such a nice, robust job of coding the darn thing that it is unbreakable (sure it is) or, more likely, nobody has pushed this thing to the breaking point. Given enough time and enough monkeys. All of that aside, every so often my boss comes up with new things that she would like the system to do, and I have to build them in. Fortunately, I built it in such a way that most everything just kind of “plugs in.” It’s not so much that I have an API and whatnot; rather, I can simply build out a module, run an include and use it. Neat, isn’t it?

  • Inflexible XML data structures

    Happy new year! Going into the start of the new year, I have a project that has carried over from the moment I started my current job. I am working on the information architecture and interaction design of a web-based insurance tool. Something that I have run into recently is a document structure that was developed using XML containers. This, in and of itself, is not an issue. XML is a wonderful tool for dividing information up in a useful way. The problem lies in how the system is implemented. This, my friends, is where I ran into trouble with a particular detail in this project. Call it the proverbial bump in the road.

  • Accessibility and graceful degradation

    Something that I have learnt over time is how to make your site accessible for people that don’t have your perfect 20/20 vision, are working from a limited environment or just generally have old browsing capabilities. Believe it or not, people that visit my web sites still use old computers with old copies of Windows. Personally, I have made the Linux switch everywhere I can. That being said, I spend a certain amount of time surfing the web using Lynx. This is not due to the fact that I don’t have a GUI in Linux. I do. And I use Firefox for my usual needs, but Lynx has a certain special place in my heart. It is in a class of browser that sees the web in much the same way that a screen reader does. For example, all of those really neat iframes that you use for dynamic content? Yeah, those come up as “iframe.” Totally unreadable. Totally unreachable. The iframe is an example of web technology that is inaccessible on the web. Translate this as bad news.

  • Less is less, more is more. You do the math.

    By this I don’t mean that you should fill every pixel on the screen with text, information and blinking, distracting graphics. What I really mean is that you should give yourself more time to accomplish what you are looking to do on the web. Sure, your reaction to this is going to be “duh, of course you should spend time thinking about what you are going to do online. All good jobs take time.” I say, oh young one, are you actually spending time where it needs to be spent? I suspect you aren’t.

  • Note to self, scope is important.

    Being that this was an issue just last evening, I thought I would share something that I have encountered when writing Javascript. First of all, let me state that Javascript syntax is extremely forgiving. You can do all kinds of unorthodox declarations of variables as well as use variables in all kinds of strange ways. You can take a variable, store a string in it, then a number, then an object and then back again. “Weakly typed” is the operative phrase. The one thing that I would like to note, as it was my big issue last evening, is the scope of your variables. So long as you are careful about defining the scope of any given variable, you are okay; if not, you could have a problem just like I did. So, let’s start with scope and how it works.
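    The scope pitfall described above can be reproduced in a few lines (a minimal sketch; the variable names are mine): assigning to a variable without `var` inside a function silently clobbers the variable in the enclosing scope, while declaring it with `var` keeps it local.

```javascript
var name = 'outer';

function careless() {
  name = 'clobbered'; // no var: this writes to the outer variable
}

function careful() {
  var name = 'local'; // var scopes this to the function
  return name;
}

careful();
console.log(name); // still 'outer' -- careful() touched only its own copy

careless();
console.log(name); // now 'clobbered' -- the outer variable was overwritten
```

    One missing `var` is all it takes, which is why being deliberate about scope saves an evening of debugging.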
