Backing up on DVDs and CDs? Think Again

You are an extremely lucky person if you have never lost anything from your computer. With every day that goes by, more and more of our lives are held on hardware that could fail at any time without notice: photographs, movies, journals, business records, website files, databases, designs, contacts, emails and more.

This might sound like the introduction to a new backup solution but the truth is this is more of a self-motivating post. Despite losing photographs, work and many gigabytes of MP3s, I still don’t have a reliable backup solution. It is forever on my to-do list and never completed, as there is always something better to be doing.

Until my inevitable post describing the loss of my entire digital life, I just wanted to talk quickly about using optical discs (DVDs and CDs especially) for backup: don’t do it.

Most likely you are using more sophisticated methods such as an external hard drive, but if you have any old backups on disc, get them somewhere safe. Why? Simply because optical discs degrade over time.

The lifespan of a DVD-R could be anything from 2 to 30 years depending on the quality of the disc, storage conditions, temperature and exposure to light. As anyone who has burned discs before will know, even discs from the same spindle are of varying quality; some work and some don’t.

Without the need for research, debate, analysis or testing, it is safe to say you shouldn’t use CDs or DVDs for backup if you have anything but short-term intentions. With the number of alternatives available, it is best just to get out of this habit and look at something else.

A quick overview of alternative methods:

  • Cloud Storage – store your data online at places like Amazon S3 or Dreamhost Files Forever (my recommendation – let someone else worry about the hardware).
  • Dropbox – a great automatic solution that requires no maintenance, but limited in size.
  • External Hard Drives – applications like Time Machine on the Mac make this easy, but the drive could still fail.
  • Crashplan – looked like a good solution, similar to Dropbox but with more space and scope. When I tried it, uploads took too long and drained my bandwidth.

Whatever backup system you use make sure it is feasible, automatic and reliable. Most importantly, just make sure you have one! (note to self…)


Suggestions, criticisms or opinions? Get involved and comment below!

Online Banking Usability and The Dreaded Card Reader

When a little card reader called PINSentry arrived for one of my accounts with Barclays, I was initially a bit curious about the technology but, more importantly, I was frustrated. Could I use my bank on the move? What if I am in Starbucks and I forget it? Where am I going to keep it on my desk? (Yes, this is a concern for me!)

For those of you that are not aware, these card readers look like a calculator with a slot for your card at the top. You insert your card, enter your PIN, type in a code from the website and receive a code to type back into the website.

A number of years ago, while working for an up-and-coming warehouse software company, I came across a somewhat paranoid but fascinating solution one of our clients used to protect their network. We often had to access databases remotely and in this case we did so via a VPN (a VPN is basically a way of connecting to a private network over the internet).

The added twist was that we had to enter an extra code. This code was shown on a little keychain dongle that had been sent to us – while I don’t remember the name or details, the dongle gave us the password and it was different every time. We had excited conversations between ourselves about how this thing worked and how the algorithm could possibly be cracked. Soon enough we realised that the whole idea was a huge burden – many people needed to access the VPN from many different places and we only had one dongle.
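In hindsight, the dongle was almost certainly a one-time password token. As a rough illustration of the general technique – my assumption, since I never learned which algorithm our device actually used – a time-based code can be derived from a shared secret and the clock, so the server can independently compute the same value:

<?php
// A minimal sketch in the spirit of TOTP (RFC 6238); the secret is a
// made-up example, not anything from the real device.
function totp($secret, $digits = 6, $period = 30)
{
    $counter = (int) floor(time() / $period);       // changes every $period seconds
    $bytes   = pack('N', 0) . pack('N', $counter);  // 8-byte big-endian counter
    $hash    = hash_hmac('sha1', $bytes, $secret, true);
    $offset  = ord(substr($hash, -1)) & 0x0F;       // dynamic truncation
    $number  = unpack('N', substr($hash, $offset, 4));
    $code    = ($number[1] & 0x7FFFFFFF) % pow(10, $digits);
    return str_pad((string) $code, $digits, '0', STR_PAD_LEFT);
}

echo totp('shared-secret-key'); // dongle and server derive the same code

Cracking it would mean recovering the shared secret, which is why our excited speculation never got anywhere.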

We toyed with the idea of setting up a webcam and broadcasting the readout via a webpage, completely undermining the security. Fortunately the client scrapped this security measure.

I write this because this is exactly the frustration we all face with bank card readers. Currently I am in Asia (Koh Chang, Thailand to be precise) and I am carrying no fewer than three of these readers, one of which I had to have shipped from the UK after they replaced it.

The Barclays PINSentry Card Reader. I H8 U.

Why else are these readers a burden?

  • They prevent multiple users accessing the account simultaneously (think business accounts and shared accounts)
  • They require your card, so two items need to be carried at all times. Some people never use their cards and have no reason to carry them
  • Card readers can break
  • Card readers need batteries
  • They are not convenient to carry

It is no doubt a big priority that these systems remain secure. Ultimately, though, that security is irrelevant if customers do not use the service or transfer to banks that have a more user-friendly security method. So, in my opinion, the number one priority for both bank and customer is usability.

Unfortunately, the card readers are just the tip of the iceberg for me. Some of the other frustrations I find with online banking:

  • Non-standard login methods, preventing your browser or plugin (LastPass!) from working
  • Incomprehensible design and technology decisions, causing major usability problems (e.g. the browser back/forward buttons logging you out)
  • Lack of stored data (most of my accounts only hold a couple of months’ statements)
  • Lack of browser/device compatibility (no chance of banking online with my phone)
  • Lack of reliable notifications for payments (did they receive it? what is the status?)

The Light

Fortunately, I have recently seen a couple of examples of great forward thinking in the online banking arena. Of particular note is Barclaycard’s excellent new online interface, launched around July 2009. As the screenshot below shows, you can see up-to-date graphs of your spending, categorised into groceries, fashion, travel and so on.

Barclaycard Online Interface

Following Barclaycard’s pioneering lead, what else can we look into for our online banking solutions?

  • Notifications – RSS, text, email and/or desktop messages showing transactions, balance, charges and statement availability
  • Phone integration – an iPhone app for my banks, with push notifications, would be immense
  • Integrations – achievable via notifications for a programmer, but some kind of integration with invoicing software or personal finance systems would be a big time saver
  • Better exports – the ability to export all information, from all date ranges
  • Better use of information – imagine the information that must be available on each transaction: location, company details, exact time, balance at that moment and so on
  • Standardisation across banks, allowing the ability to view finances together (perhaps only realistically achievable by integrations)

Existing Integrations

I intend to investigate further, although it seems that existing websites bringing finances together in a truly automated way are still in their infancy, probably due to our banking system here in the UK. It was only last year that the banks upgraded their systems to allow instant money transfers (instead of a 3 to 5 day delay) – according to a reliable rumour I heard, this was because some banks were using the equivalent of spreadsheets to organise these transactions.

Some of the sites that are worth investigating (thank you to Emma Davies of LoveMoney for her contributions here):

  • Mint.com – currently seems to be USA only.
  • Money Dashboard – looks slightly amateur, although it claims to integrate automatically. Try with caution; I saw them spamming on money forums. Still in beta with no launch date.
  • LoveMoney.com – a UK-only company launched in April 2009, with online banking launched in December last year. Constantly improving, with updates every two weeks. Check out the Love Money Blog.
  • Kublax – a 2007 Seedcamp winner, but facing closure due to lack of funding. It could be saved by Simply Finance, so still worth keeping in mind.
  • Wehuhu – no integrations yet (manual uploads only) but this is a new service and could be expanding soon.

With some digging around, there appears to be resistance from the UK banks which is delaying these types of systems. One quote regarding Mint.com, from the Money Saving Expert forums: “They’ve said they aren’t going to launch a UK version for the foreseeable future. None of the major UK banks have gotten on board to allow sharing of transaction data.”

Imagine the Future

I am an optimist. I picture a time when I wake up in the morning, check my emails and see that I have received three payments, with details of who each is from and the exact date/time it was sent. The summary also shows that of the five payments I sent last night, three have been received successfully and two are still pending. My invoicing software is notified of the payments and marks the relevant invoices as paid.

Sat in my favourite coffee shop, I check out my iPhone banking app. I can quickly see the balance of all my accounts, credit cards (including available spend) and also that the two pending payments are now confirmed.

That evening I travel out of my home town and pay for petrol on my card somewhere I have never been before. Because the transaction is slightly out of character, I instantly receive a text message notifying me of it, with a web link and a number to report fraud and instantly freeze the card if necessary.

It is the end of the month and I am checking my credit card statements. I can easily see what I have spent compared to the past 6 months, by category of expenditure. I can see a graph showing my expenditure over the month and realise that in the first week I went a bit overboard on clothes shopping. My account shows that all bills are scheduled to be paid and calculates that there is enough money to pay them all, giving me a total of “free cash” that I can withdraw during the month.

This is just a sample of how much control we could have and how convenient banking could be. How nice would it be to see a cheap MacBook Air in the shop, check your finances instantly on your phone and only buy if you can afford it? Not very nice for the banks, it appears, which may be one reason they are dragging their heels when it comes to providing us with convenient information.

Why Banks Should Embrace This

More control over our finances should theoretically mean fewer mistakes, fewer people overdrawn, less interest and fewer fines; all of which equals less profit for the banks.

The reality is that not everyone will embrace these new features. Offering this technology does not instantly make everyone in the country good with money – those who are too busy, too scared or otherwise not motivated will still make the mistakes they always have. The ones who are craving this power will reward the banks with more business.

Advancing online banking technology will not put more money in people’s pockets, remove their greed or fix their lack of money skills.

Let me offer an example. I always play it safe – if I have any doubt over how much money I have, I won’t buy. I am not tricked by overdrafts, high credit card limits or buy-now-pay-later offers. I like to think I know about money and I don’t like paying interest – I always pay in full and on time.

If I am given more control and information, I am likely to know exactly what I can afford and spend more. As a bonus, if I am told that buying a new espresso machine this month will only cost me £7.24 in interest if I put it on the card and pay it off the month after, I will be tempted to do so.

There is an opportunity here for financial institutions to get a head start on the others. I am not afraid to change to a bank that offers me more information, better access and more convenience, and is more forward thinking. As banks know, an existing customer is more likely to take out a loan, a mortgage or other products.

Better tools to access and monitor information mean people will be using the banks’ systems more. More usage means more potential for advertising, upselling and gaining customers’ trust.

If my bank embraced technology even slightly, I would respect them more and listen to what they have to say.

Solutions to Security

There are still obstacles around security, put in place by self-righteous technology consultants and the paranoid media. With this mindset, usability is often completely ignored.

I admit I am not particularly security-focused, but I believe finding a solution that is convenient for the user is essential. Regardless, here are some potential security starting points:

  • Instead of card readers, how about additional security only when unusual activity occurs? (e.g. different location of login)
  • Grant lower security to those who want it. Give us a choice! I will take the risk, because I know I won’t enter my details into www.barclaysbank.somerandomdomain.com/login
  • Custom security levels. e.g. by default require additional authentication only when money is transferred (customisable by the user)
  • Biometrics – some way off for the mainstream (how many of you have a fingerprint reader?), but should be implemented when feasible
  • Trusted machines – link my laptop or desktop to the website, meaning I only have to jump through a hoop once

What do you think?

Are you happy with the banking system? Are you inconvenienced by the security, or do you not mind and prefer the peace of mind? Do you have any alternatives or ideas?

If so, comment below!

Importing Large MySQL Database SQL Dump Files

If you have ever tried to transfer a large MySQL database you will no doubt have come across some issues. phpMyAdmin is the most popular interface, and most web hosts provide it (and only it) as the way for you to interact with your databases.

Unfortunately, importing an SQL dump containing your data and table structure via phpMyAdmin often causes timeouts when the database is particularly large, meaning you have to import data one table at a time (or even part of a table at a time), which takes a long time and is prone to error.

In the past I have done many things, from writing custom import scripts to importing the database bit by bit. I have also used desktop tools (such as the excellent Sequel Pro for Mac OS X), although many web hosts disallow external access.

Fortunately I came across an excellent, flexible import script called BigDump which takes your SQL file, splits it and automatically imports it (via JavaScript reloads). The script and full instructions are available at http://www.ozerov.de/bigdump.php
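For the curious, the core idea behind this kind of script is simple: read the dump one statement at a time instead of loading the whole file into memory. A minimal sketch of the concept in PHP (my illustration, not BigDump’s actual code; it assumes each statement ends with a semicolon at the end of a line):

<?php
// Import a large SQL dump statement by statement.
$db        = new mysqli('localhost', 'user', 'password', 'database');
$handle    = fopen('dump.sql', 'r');
$statement = '';

while (($line = fgets($handle)) !== false) {
    $trimmed = trim($line);
    if ($trimmed === '' || strpos($trimmed, '--') === 0) {
        continue; // skip blank lines and comments
    }
    $statement .= $line;
    if (substr($trimmed, -1) === ';') { // end of one statement
        $db->query($statement);
        $statement = '';
        // A real script would also record the file offset here and
        // reload itself before hitting the PHP time limit, which is
        // essentially what BigDump's JavaScript reloads achieve.
    }
}
fclose($handle);

If you have SSH access, of course, a straight command-line import avoids the timeout problem altogether.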

Follow me on Facebook for more tips and tricks!

Do you have any tips, tricks, advice or questions about importing/exporting MySQL databases? Get involved and comment below!

CAPTCHA – Passing the Problem to the User

CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) is a method of preventing spam commonly used in web forms. The user is shown a distorted image of numbers, letters or words and asked to type them out. This ensures that the form is not filled out automatically by a bot and reduces the spam sent through that form.

It is clear that this solution goes a long way towards dealing with the spam problem and the webmaster is happy, but at what cost to the user? At best there is another field to fill out. At worst, if the image is hard to read or the user has a visual impairment, it is a barrier to completing the form.

Most good CAPTCHA implementations include both a visual option (an image with text) and an audible one (audio playback of the text), meaning that even those struggling to read the image will be able to complete the form. There is also usually an option to “refresh” the image, generating a potentially easier-to-read one.

All of this helps but does not change the fact that the user has to do more work to complete the form. We are always looking to increase conversions and should be looking at reducing the actions a user has to take (such as avoiding duplicate email and password fields) in order to increase the chance they will fill out the form.

So what about alternatives? A great example of spam detection is the excellent Akismet, as popularised by WordPress. Spam is automatically detected by algorithms and tests on the Akismet server, which reduces manual checking to pretty much zero while leaving the user experience untouched.

Using Akismet or a similar service is the ideal solution – the user does not see a change and the spam problem is still solved. The service should err on the side of caution, to prevent losing genuine information, and regular checks of the “spam” content should also be considered.
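To give a flavour of how simple this can be, here is a rough sketch of a form handler calling Akismet’s comment-check endpoint. The field names follow Akismet’s public REST API as I understand it, but verify against their documentation before depending on this:

<?php
// Ask Akismet whether a form submission looks like spam.
$apiKey = 'your-akismet-key'; // placeholder - use your own key
$fields = http_build_query(array(
    'blog'            => 'http://www.example.com',
    'user_ip'         => $_SERVER['REMOTE_ADDR'],
    'user_agent'      => $_SERVER['HTTP_USER_AGENT'],
    'comment_content' => $_POST['message'],
));

$context = stream_context_create(array('http' => array(
    'method'  => 'POST',
    'header'  => 'Content-Type: application/x-www-form-urlencoded',
    'content' => $fields,
)));

$response = file_get_contents(
    'http://' . $apiKey . '.rest.akismet.com/1.1/comment-check', false, $context
);

if ($response === 'true') {
    // Akismet thinks it is spam - flag it for review rather than
    // discarding it, to err on the side of caution.
}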

The second-best solution, which can also be used in conjunction with an Akismet-type integration, is to improve your admin systems. If your form submits directly to email then it will be difficult to check and remove large amounts of spam. A tidy admin area with the ability to view/edit/delete data all together will reduce time spent checking for spam.

Many CAPTCHA implementations are unnecessary, added either out of habit or from some idea that it is the professional thing to do. Question whether you will actually receive spam, and whether it would be better to deal with it manually.

The final alternative is to make the CAPTCHA less obtrusive, more fun or easier in some creative way. Microsoft have an initiative called ASIRRA which shows promise – the user is asked to identify pictures of dogs and cats. A simple click by the user is all that is needed, and in tests many found the exercise fun.

Some websites also offer simple random questions (e.g. what is 3 plus 5?), although these can potentially be circumvented and lack the fun appeal for most users. A minimal sketch of this approach follows below.
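This sketch (assuming PHP sessions are available) shows both how little work is involved and how little protection it really offers:

<?php
// A simple "what is X plus Y?" check for a form.
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $answer = isset($_POST['answer']) ? (int) $_POST['answer'] : -1;
    if (!isset($_SESSION['expected']) || $answer !== $_SESSION['expected']) {
        die('Wrong answer - please try again.');
    }
    // ...process the genuine submission here...
}

$a = rand(1, 9);
$b = rand(1, 9);
$_SESSION['expected'] = $a + $b;

echo "<form method='post'>
        What is {$a} plus {$b}?
        <input name='answer'>
        <input type='submit' value='Send'>
      </form>";

Any bot written specifically for your site could parse the question, which is exactly why this only deters generic form-spamming scripts.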

In conclusion, I stress that you should consider first and foremost whether you need spam protection at all. If you do, try to ensure the spam protection does not intrude on the user’s experience – your conversion rates will reward you.

Do you have any CAPTCHA tips, tricks, advice or questions? Get involved and comment below!

Accessing Websites Without a Subdomain

World Wide Web
Creative Commons License photo credit: Bull3t

The www prefix is essentially a subdomain of your domain, and although many people will type it in, some may just try to enter the domain itself.

Many websites, including high-profile ones, can’t be accessed without the www subdomain. For example, barclaycardunwind.co.uk is inaccessible while www.barclaycardunwind.co.uk works fine.

This is essentially a DNS or server configuration problem, so a single solution is not available for everyone but it should be straightforward. Contact your web host if you are not sure or modify your server configuration if you know what you need to do.

Once you can access your site without the www, make sure that traffic is forwarded to one or the other to prevent potential duplicate content issues with Google.

Make sure you do it, or risk frustrating and/or losing a lot of your visitors!

Understanding Google Gears – The Basics


Google Gears was released in mid-2007 and, as described by the big G themselves, “enables more powerful web applications, by adding new features to your web browser”. This is a slightly generic explanation, so the aim of this post is to show exactly what Google Gears does and how you can benefit from it.

Essentially Gears (the official name, reflecting its open source nature) extends your web browser. You perform an install from http://gears.google.com/, which is available for most modern browsers. A website can then use the following features via Gears:

  • A local server, to cache and serve application resources (HTML, JavaScript, images, etc.) without needing to contact a server
  • A database, to store and access data from within the browser
  • A worker thread pool, to make web applications more responsive by performing expensive operations in the background

But what does this actually mean and what makes it good?

Offline Functionality – By using Gears, the application and data can be stored offline and the user can perform tasks without an internet connection. Data can then be synchronised once a connection is available. A great example of this is Google Mail – with Gears you can view your emails, reply, send new messages, access your address book and so on, and everything is synchronised once you get an internet connection. See also the to-do list application Remember the Milk.

Faster Sites – Gears can store files, data and code on a local web server, meaning your machine shares the processing power and reduces the need for data transfer. WordPress implements this well in the admin area, speeding up many tasks.

Quicker Searches – Full text searching can be performed locally, meaning less server resources are used and quicker results for the user. MySpace takes advantage of this feature.

Threading JavaScript Code – On a more technical level, the WorkerPool functionality allows JavaScript to be run in the background without affecting the browser. This essentially means that the page is not blocked while this JavaScript is running, meaning a more user-friendly experience and hopefully the disappearance of the “A script is taking a long time to run…” dialog boxes.

GeoLocation access – Google Gears can request the geographic location from the client (if there is a GPS device, such as on an iPhone) or derive it from network information (IP address etc.). This is made simple for the coder and allows the web application to know with accuracy and efficiency where the client is located.

What else can Gears do?

Gears is still in beta and is not recommended for public use yet (despite many Google applications using the technology). Gears provides an interface between the website and the client’s machine, meaning that in future websites will, for example, be able to:

  • Encode video/audio locally
  • Send notifications to the user’s desktop (e.g. Windows XP bubbles or OS X Growl-style notifications)
  • Access files locally
  • Resume file uploads
  • Use the client’s camera

Some important information

Google Gears needs to be implemented by the website, so it is an extra strain on lower-budget websites and applications. Also, the client needs to install an extension to their browser, meaning your typical user won’t have the software available. This means that at the moment only hugely popular technical/professional-style sites are implementing Gears.

The popular Firefox plugin Greasemonkey, which basically allows you to use scripts (or your own code) to modify websites on the fly, can be used to implement Gears features even when the website hasn’t implemented them. See the Google article Gears Monkey for more information and a script for Wikipedia.

The actual name of this technology is Gears. Google renamed it from Google Gears to reflect the open source direction of the project. Many people, including myself, still refer to the technology as Google Gears because “Gears” on its own is ambiguous.

What other web applications are currently using Google Gears?

Here are a few I know about:

Do you have any Google Gears tips, tricks, advice or questions? Get involved and comment below!

.htaccess Redirect from Non-WWW to WWW

If you can access your website both with www and without, then you are effectively serving duplicate content. I am sure Google are sensible enough to filter this out, but some people feel it could be an issue, so my recommendation is better safe than sorry.

Place the following code in your .htaccess file to automatically force www. on the front of your URLs:


# Turn on the rewrite engine
RewriteEngine On
# If the requested host does not already start with "www."...
RewriteCond %{HTTP_HOST} !^www\.
# ...permanently redirect to the same URL with "www." prepended
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

You might not need the “RewriteEngine On” line – check whether it already exists in your .htaccess.

There are a lot of other .htaccess solutions, but this particular one does not need editing (i.e. your URL is not in the code), so you can use it in a distributed project or across many websites without having to customise it.
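For completeness, the mirror image – forcing the www off instead – is just as host-agnostic. A sketch along the same lines (test it on your own setup before relying on it):

# Turn on the rewrite engine
RewriteEngine On
# If the requested host starts with "www.", capture the rest of it...
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
# ...and permanently redirect to the bare domain
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

Whichever direction you choose, pick one and stick to it so that search engines only ever see a single canonical host.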

Why I Use Coda for Web Development

Coda by Panic - Web Development Software
Coda by Panic Software is actually quite an important piece of software for me. About two or three years ago I was an avid PC user surrounded by Macs, but after seeing this bit of software I finally decided to switch.

I had thought about switching earlier but it was the slickness and features of Coda that persuaded me (along with Ellen Feiss of course):

Slick Interface
It is often underestimated, but a well-thought-out user interface is really important when you are using software daily. Menus are well designed, buttons are uncluttered and on the whole it is intuitive. This helps with motivation and encourages you to try out new features.

Integrated CSS
A single click will take you from code view to a comprehensive CSS GUI. This is perfect for when you can’t remember the syntax or want to pick a colour.

Designed for Experts
Thankfully, Coda seems to have left out all the hand-holding that other development environments like Dreamweaver are cluttered with. You go straight into code view, and the extra WYSIWYG options are only available if you seek them out.

Well Organised Sites
The way Coda stores and displays your sites is ingenious. The first screen you see is filled with screenshots of your saved websites, and double-clicking logs in via FTP and shows your files ready for working on. This means you can get started on a site in seconds.

Command Line Integration
Get into the command line in a couple of clicks within the Coda environment. Login details can be saved for each site individually.

Integrated Books
HTML, CSS, JavaScript and PHP references are included by default, with other sources you can add manually. This means you can quickly look up code from within the environment.

SVN Integration
While I don’t need this anymore, the integration with Subversion is well thought out and a great time saver.

Work Live via FTP
The ability to work live on your files is fantastic for me. While this can be risky (I have never had a problem, however), it allows you to make changes super quickly and for me speeds things up massively. If you don’t like to work live, Coda will track the files you have changed locally and upload them with one click, which is almost as quick.

Search Across Files
Although this isn’t quite as strong as Eclipse yet, you can still search through open files and local files quite efficiently.

Plugins
An open interface allows anyone to create plugins for the software, meaning some great addons are available like CSS Tools (compress and tidy your CSS) and PHP Validator.

Wildcards in Search and Replace
Grep and regular expressions don’t come naturally to me, so this search-and-replace function is a big time saver. You can basically put a wildcard in both the “Search” and “Replace” boxes to switch things around – it is hard to explain but easy to use.

Clips
Save little snippets of code, font families for CSS, standard comment headers, basic HTML structure or anything else that will save you time, and retrieve them with a couple of clicks.

This is just scratching the surface of what Coda can do, but these are the features that matter most to me (when compared to other software). Some other cool things include the ability to collaborate on a document (live over the internet) and preview your pages.

I would love to hear what you like about Coda and maybe discover some new features – simply comment below.

Tag Clouds and Tagging – Does it Work?

Barack Obama’s speech in Berlin, treated with Wordle
Creative Commons License photo credit: karstenkneese

“Tag cloud”

This phrase was once so sexy. Instantly you can picture a block of random words of varying sizes and/or colours in a random order. Now it seems to have fallen out of favour – since its initial conception, nobody has been talking about how to improve or advance the technology.

A Tag Cloud was once talked about as a perfect example of Web 2.0 – user generated content, funky design and alternative UI navigation. Many sites quickly jumped to using Tag Clouds and even now most new sites that can use them will.

What caused the hype with Tag Clouds? They ticked all the right boxes:

  • User Generated – The idea that your website visitors will tag everything and save you time in organising content.
  • Alternative UI – Finding better ways to access content is a good thing.
  • Modern Design – As a new but widespread idea, a Tag Cloud gives visual appeal and association with high-profile sites.
  • User Interaction – Allowing user contribution improves community and repeat visits.

For me a Tag Cloud has been a paradox. When discussed while planning a new site they always sound like a good idea, but in practice they often fall down or cause problems. The initial designs benefit from them, yet once live they are rarely clicked and often ignored by visitors. The prospect of self-organising content is tempting, but often more administration is created, not less.

Why is this? From my own experience as a user, tags are just too fuzzy. I often waste more time trying to think of the right tags, and as a result I am most likely not to tag at all. I always turn off tags in WordPress installs and have never taken to Delicious properly because the tagging seems like hard work to me.

Essentially there is too much freedom and ambiguity. Do you tag an SEO tool website as “SEO”, “Tool”, “SEOTool”, “SEO Tool” or all of the above? Do you use singular or plural? If you’re searching for content, can you guarantee it has been tagged with the right keywords, or could you be missing something?

Here is a breakdown of my issues with Tag Clouds:

  • User Generated – Your users can and will do what they like. While you will get some great users who tag perfectly, others will make spelling mistakes and not care. Others still might bicker over pointless details such as the use of capitalisation or singular versus plural. Users will mix different approaches, resulting in a total lack of cohesion.
  • User Interface – While the Tag Cloud looks good to an experienced internet user, to others it is not self-explanatory. More importantly, even experienced users will ignore a tag cloud for the reasons above (basically inaccuracy). The Tag Cloud relies on the idea of “casual browsing” – in other words, a user who doesn’t know what they want to see next. In reality we mostly browse with a purpose or goal in mind, with little room for suggestions of what to look at and therefore little room for a tag cloud.
  • Modern Design – While initially it might have been beneficial to be associated with the Tag Cloud crowd, it no longer stands out and your users will be blind to it. Even if your website implements tags in a flawless and useful way, the association with bad implementations on other websites will ruin your hard work.
  • User Interaction – If your users don’t understand the need for tags and you make them an integral part of the system (like Delicious) then you risk making your service seem like hard work. For me tags don’t flow easily, and therefore it seems like hard work to bookmark a site with Delicious.

To illustrate my point, here are a couple of examples of how Tag Clouds frustrate me. They show how this applies both to group tag clouds (i.e. where many people can tag the same thing) and to individual tag clouds (i.e. where only the content owner/publisher can tag).

Flickr – The original tag cloud user, Flickr allows its users to tag photographs with whatever they want. The result? Ambiguity (search for cheese and you get a dog). Overkill (photos tagged with many variations, e.g. “samui”, “koh samui”, “kohsamui”, “thailand”, “chaweng” etc.). Inaccuracy (misspelled tags). Spam (using a “Michael Jackson” tag incorrectly to attract visitors).

Delicious – This bookmarking tool tags in a different way, by allowing all users to tag the same website. The main problem is the variety of tags that get used. Take the excellent Penny Arcade web comic – it has a vast number of different tags, including the ambiguous “fun”, “art” and “monday”. In reality this offers no value for navigation and creates a headache when you want to tag the site (should I tag it “fun” or just “comic”? Or maybe “webcomic”? Or how about “gaming web comic”, or is that too specific?). The result is simply a mess of disorganisation.

Unfortunately, for all the promise, I cannot see a benefit to using Tag Clouds in this way. I can understand tagging content in a controlled and structured environment, but on community websites on the open web it is impossible to organise and manage.

Once you start using Tag Clouds it is hard to go back. Users will be used to them (some will of course like them) and your navigational structure will no doubt be based around them.

So in reality your Tag Cloud could actually be doing your website some harm. Don’t simply follow the cloud crowd – think about whether this concept will work for your site, and know for sure before it is too late.

Magento eCommerce – Bloated or Brilliant?

code bug
Creative Commons License photo credit: gui.tavares

When I first heard about Magento eCommerce I got excited. Very excited.

For years I had been using a modified Zen Cart codebase which worked well and was easily hackable, but constantly frustrated me with bad system design, poor templating, low standards compliance and a bickering development community. It was stuck in the past, but I made it work with my own modifications over time.

Magento appeared on the scene at about the time everyone was quibbling over what Web 2.0 meant. It was a time of shiny new applications, short & brandable URLs, gradients and buttons. At first it seemed too good to be true: an MVC approach (using the Zend Framework), the latest technology, well-thought-out features, an upgradeable platform, company backing and developers not afraid to use the latest versions of PHP and MySQL.

I installed an early beta, played with it a little and loved the features, but I didn’t give the codebase much thought. It was strange: a lot of files, a lot of directories, a completely OOP approach (while I find OOP useful, I had never seen it used so strictly and extensively) and an unusual use of XML. I accepted this as a learning curve and decided to use it on a commercial project when the opportunity arose.

I must stop here and make clear that I don’t intend to cruelly pull apart any particular methodology, application design or any other development decisions. My beliefs are that these kinds of decisions depend on a variety of factors including:

  • The skill set of developers working on and with the system
  • Technological constraints (server technology etc.)
  • Intended use of the system (private, open source, licensed etc.)
  • Lifespan of the system
  • Type of application

Therefore I am saying it is not a case of “which methodology/programming language/framework is best” but more a case of “what is this project and what technology will suit it best”. As I will explore below, I believe Magento made decisions based on what would be cutting-edge, fresh and idealistic instead of looking at what would suit a new open source eCommerce system.

So back to my story: I had found an opportunity to use this new system. A project requiring product filtering, high numbers of orders, one-page checkout (which Magento does beautifully) and many add-ons that would have required a lot of hacking in Zen Cart. I took the plunge and went with Magento.

Not long after getting started on the project I realised that the application design didn’t “fit” with me. I have worked with a lot of custom and open source web applications, database systems, retail software and warehouse management software, and I always managed to find an understanding with them. No matter if the code resembled long, thin, cylindrical pasta of Italian origin or was written in a foreign language, I managed to get my head around it and work with it comfortably.

With Magento this just did not happen. I could not understand what decision process resulted in thousands of directories and tens of thousands of files (a lot of which contain empty class declarations). I couldn’t figure out the pattern of where code went or what the abstract folders called “Convert”, “Entity”, “Layer”, “Resource” etc. are for. The naming conventions for folders, models and so on didn’t sink in – sometimes you needed lower case, sometimes camel case and sometimes a single capital letter.

Ultimately, I look at the results. It took me five times longer to perform any task in the code (template changes, adding features, debugging etc.), and even Magento-specific consultants agree with this. Documentation was sketchy at the time, but I don’t believe this matters. Apart from quick references, I have never read a manual, watched a video or anything else when trying to figure out a new system. I follow the code, look at the file structure and experiment.

Due to the Zend/OOP/MVC influence on Magento it is impossible to follow the code. Classes are referenced dynamically, various aspects are contained in XML files and there is no clear flow that you can just debug through. The sheer volume of files and folders makes finding something unbelievably tedious.

Even the database is a minefield. In every other system I have used, finding data is easy. In Magento, the use of EAV (Entity-Attribute-Value) means that data is split amongst hundreds of abstract tables. Again, it doesn’t flow and it doesn’t make sense without spending a great deal of time developing a solid understanding of what they have done.
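To illustrate – with table and attribute names recalled from memory, so treat them as approximate – fetching something as basic as a product’s name means joining the entity table to a per-type value table via the attribute registry, where a conventional schema would read a single column:

<?php
// Roughly what fetching one product's name looks like under Magento's
// EAV schema (names approximate). A conventional schema would just be:
//   SELECT name FROM products WHERE id = 1
$sql = "SELECT v.value AS name
        FROM catalog_product_entity e
        JOIN catalog_product_entity_varchar v ON v.entity_id = e.entity_id
        JOIN eav_attribute a ON a.attribute_id = v.attribute_id
        WHERE a.attribute_code = 'name'
          AND e.entity_id = 1";
// Run $sql against a Magento database to see the shape of the result.

Multiply that by every attribute type (varchar, int, decimal, text, datetime, each in its own value table) and you can see why casually browsing the data is hopeless.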

All of this and more ate away at me until I began to resent working on the project. I concede that this might be isolated to a minority like me, but I do know I am not the only one to have problems. Different freelancers I employed to do custom work didn’t follow conventions and hacked into the core (causing upgrade problems, discussed later). I regularly see forum threads discussing the same issues and blog posts expressing the same concerns.

The common arguments I see are along the lines of “the web is evolving, applications are getting more complex” and “for more features we need more structure, and this was necessary with Magento”. I even saw one elitist claim that a system like Magento is “proper PHP development” whereas systems like WordPress are for “PHP hackers”.

Essentially I disagree.

Yes, Magento is an advanced system, and I am sure that once the rules are learned and understood these application design decisions make sense. Unfortunately, I do not believe the time needed to understand them (and keep understanding them) is worth the potential advantages.

As I touched on before, application design depends on the facts. Here is my take on the facts applying to Magento and the community that will use the system (or any modern open source eCommerce system):

  1. Potential developers will have a wide variety of different skills
  2. Potential developers will have varying ability and experience
  3. Potential developers will have different opinions on coding standards, application design, templating etc.
  4. Profitability (e.g. support, hosting, addons) requires a good uptake and community
  5. Hosting technology will vary massively
  6. Open source developers will want to contribute code and participate in a community
  7. Due to the free nature, many people will want to quickly try the application
  8. Free software does not benefit from the same attention span as paid software
  9. The project must maintain a reputation for security and progress

On looking at the factors involved, I believe these are good aims for the project as a whole:

  1. Code set must be easily picked up by developers from varying backgrounds
  2. Code base must be intuitive for “Newbies”
  3. Coding standards must not be elitist or provoke time wasting debates and indecision
  4. Must be focused on good uptake (easy to install, get started with etc.) to allow commercial opportunities
  5. Must be forgiving to different hosting configurations
  6. Easily allow sharing and developing addons. Addon system must be logical and easy to understand.
  7. Must allow quick install and trying out of the application
  8. Must not require excessive reading of guides to get started
  9. Allow easy updates to ensure security and new features are rolled out. Discourage “hacking” the code.

From my experience, most of these points are affected in some way by the need to use this complicated approach to code. When the rules are too abstract, hacking occurs. When the code is hacked, upgrades become a big job. When upgrades are difficult, they do not happen, and new features and security fixes aren’t rolled out.

Hosting is also a notorious problem for Magento; its fussy configuration means many users complain of speed issues. The only hosts running Magento well are the ones with good knowledge of the system – an unusual scenario for a hosting company.

From my experience, working with Magento requires a lot of learning and re-learning. This just isn’t practical for your average web developer, simply because of time and budget constraints. It has nothing to do with skill level, ability to learn or unwillingness to try new methods. It is about pragmatism over perfectionism.

By being pragmatic I mean finding an optimal solution to the problem without doing the unnecessary. I have been down the perfectionist path and it did not work for me. I now work to goals instead, to much more success. Ultimately what is better; achieving your goals within budget or using the best technology with the latest programming techniques and methodologies?

What would have been a good approach for Magento? In my opinion, the development hours put into a solution like Magento could have been better spent finding a good structure that is easily understood by the majority of developers. The structure might not be perfect, it might not follow academic methodologies and it will probably draw criticism from purists.

But ultimately if it achieves your goals, it is a success.

A great example of this is WordPress. I have often thought its code base is dodgy by professional standards, but it works. I have grown to love it because I know that if there is a problem I can find the relevant section of code easily. The structure makes sense to me and I never feel the need to hack the core. The number of plugins available is a testament to the success of the system, as is the number of websites using WordPress and its successful commercial arm.

WordPress is achieving its goals.

A final cynical note… Many people claim that the complexity of Magento is somewhat intentional. The profitability of Magento relies on consulting, technical support and installations. Making the codebase complex could mean that many developers start out, get stuck and pay for help. If this is Varien’s intention then perhaps they have been very goal-focused, although I doubt this plan would have long-term appeal. Personally, I believe that Magento is simply the result of focusing too much on cutting-edge methodologies and not enough on what they actually want to achieve with the system.

Comments, thoughts, opinions or complaints? Leave a comment below…