Software runs my life


Domain.com.au Improves their Search Usability

New Domain Search Form

Domain.com.au have updated their search tool with a new filtering method: an accordion-style menu on the left-hand side that lets you apply filters across a number of different property parameters. The filters include the usual bedrooms, price, etc., plus some new fields such as Special Features, only listings with a price specified, only those with photos, properties with Open Homes this weekend, and more. There are some other more subtle changes too, including differently coloured summary-view ad titles, a “See surrounding” link, floor-plan links from the summary listing, sorting by inspection time, and an RSS feed of search results.

I like the improvement, and it seems the agent feedback is generally positive too. They reference the DotHomes website as an example of great usability. I agree that it is very simple to use; however, I do get frustrated by the lack of consistent controls and the inability to fine-tune your options straight from the home page. For me, consistency is the number one priority, largely because I think usability is about reducing the learning curve (and that is made much easier by having only one control to learn). Additionally, when you refine that one control the benefits flow across the whole site, enhancing every section. All the property websites still seem to believe a suburb search is all you need on the front page; I am hoping to see that change in the near future.

Best Usability Mockup Tools

In my current role I am really noticing the huge rewards delivered through extensive prototyping and usability testing. The ability to better capture and illustrate user feedback (internally and externally), as well as to accelerate application development, cannot be overstated. As they say, a picture is worth a thousand words, but in this case a functional picture replaces a thousand words in a requirements document with ease. Requirements documents still have their place, but not as a basis for user comment or even developer guidelines. So which programs do I recommend?

Balsamiq screenshot

The first is Balsamiq, a great little tool that replaces those back-of-the-envelope sketches at 1 a.m. It is very rough and intended for initial prototypes only, but I find it well suited to situations where your stakeholders can’t see the concepts for the details. I like it because it lets me see whether the ideas that click beautifully in my head actually translate to something workable in real life. Above all, though, it is quick. Don’t expect to build fully working prototypes, but you can expect to have a full Web 2.0 application roughly laid out within an hour. Once you have the concept nailed down, it is time to move on to other tools. Think of it as throwing a couple of A4 sheets on a table and spending an hour scribbling, without the rubber shavings and sloping misshapen tables. It is great to be able to pin the result to your wall to make sure you keep focusing on the key deliverables of the application, rather than getting carried away with the details of day-to-day execution.

Axure screenshot

My favourite tool, however, is Axure. Like Balsamiq it lets you build a working website really quickly, but it then lets you “colour between the lines” and flesh out an almost fully functional prototype. Out of the box Axure is a great program, with all the basic web elements you would expect. They are all easy to edit, move, lay out and link. Generating HTML prototypes is also extremely easy: a one-click step once you have specified an output directory.

To really unlock the power of Axure, however, you need to use some community resources. This top 10 Axure resources link is a great starting point. A Clean Design’s templates (number 3 in the top 10) are my personal favourite; they cover almost every Web 2.0 element you can think of. The ones that are missing (e.g. accordions and flyout menus) are covered by the official Axure design library (which is also a good example of HTML generated in Axure).

In conclusion, these two tools are the staples of my usability and prototyping work. They are so powerful that I start to wonder: how long until I no longer need to send these prototypes off to a coder to develop and deploy my solution?

Can Spam Improve SEO?

Scott Savage Akismet statistics

For some reason I seem to get a heap of spam on my blog. Ever since I first started blogging, spam has somehow been drawn to my site (and I don’t even mention V!agr4 that often!). At the time I took the screenshot to the right, 8,368 spam comments and trackbacks had been caught. That is a pretty ridiculous number. Akismet has managed to catch 99.764% of these, which is a testament to its effectiveness (and a major reason why I use WordPress). I sometimes wonder whether allowing a few of these spam comments (which usually link to link-heavy pages) through would actually help my search engine ranking.
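A quick back-of-the-envelope check of those figures (interpreting the 8,368 as the caught count, so the true spam volume is slightly higher):

```python
total_caught = 8368   # spam comments and trackbacks Akismet had flagged
catch_rate = 0.99764  # Akismet's reported catch rate

# If 8,368 is 99.764% of the spam, the implied total volume and the
# number that slipped through the filter are:
implied_total = total_caught / catch_rate
missed = implied_total - total_caught
print(round(implied_total), round(missed))  # roughly 8388 and 20
```

In other words, only about twenty spam comments ever made it past the filter, which is exactly the small pool the rest of this post is wondering about.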

I found an SEOBook article containing a lot of interesting findings that unintentionally support my theory. Firstly, the highest-risk item is that your blog itself will get tagged as spam; however, “A few bad inbound links are not going to put your site over the edge to where it is algorithmically tagged as spam”. In fact you can push this even further: “If you can get a few well known trusted links you can get away with having a large number of spammy links”.

The next step is to understand what kind of links spam comments actually provide. Again from the article: “Spammers either use a large number of low PageRank links, a few hard to get high PageRank links, or some combination of the two.” So how do you weed out the low PageRank links and seize the high PageRank ones? Well, if everyone is running the same Akismet filter (building a blacklist/heuristic filter takes real resources, so how many distinct filters can there be?), then perhaps the high PageRank comments are precisely those that slip past the most common filters.

So should I leave the Akismet filter on, but approve everything that gets through it, even if it is spam? Or, if I wanted to be more scientific, should I analyse the PageRank of each link in a spam comment and accept only those with high PageRanks? Surely in these 8,000+ spam comments the spammers hit gold somewhere; the question is how to find it.
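The “more scientific” approach could be sketched as: pull every link out of the held spam comments, score each one with whatever authority metric you can actually query, and sort. This is only a sketch under stated assumptions; Google offers no official PageRank API, so `lookup_pagerank` below is a hypothetical placeholder for whichever link-authority metric you substitute, and the comments are assumed to be plain strings.

```python
import re

# Hypothetical stand-in: there is no official PageRank API, so plug in
# whatever link-authority metric you have access to.
def lookup_pagerank(url):
    raise NotImplementedError("substitute your own authority metric")

URL_PATTERN = re.compile(r'https?://[^\s"\'<>]+')

def rank_spam_links(spam_comments, score=lookup_pagerank):
    """Extract every link from held spam comments, highest authority first."""
    scored = []
    for comment in spam_comments:
        for url in URL_PATTERN.findall(comment):
            try:
                scored.append((score(url), url))
            except Exception:
                continue  # skip links the metric can't score
    return sorted(scored, reverse=True)
```

With 8,000+ held comments this reduces “finding the gold” to sorting one list, although approving any of those links still carries the tagged-as-spam risk the SEOBook article warns about.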

