IBM Storage: How Common Sense Became the Lifeblood of Productivity


In 1983, I had just finished my studies and found myself employed as a developer at an electrical cable and conductor company. Since I was in charge of the entire information system, I worked alongside my manager to determine the best way to increase productivity and improve the organization.

While touring the warehouse, I surveyed the gigantic skeins of electrical cables that were scattered throughout the shed and loosely sorted by size and product family. Each time a customer placed an order, the workers had to hand-sort and collect the materials. Taking note of their storage processes, I designed an algorithm that reorganized the bundles of cables — not by size and family but by the frequency of withdrawal. In this way, I moved a narrow list of products to accessible locations, allowing workers to pick up the most frequently sold products quickly and with ease. After a few months, my boss saw a significant increase in warehouse performance and a major reduction in our shippers’ wait times.
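The slotting idea above can be sketched in a few lines. This is a minimal illustration with hypothetical product names and a made-up withdrawal log, not the original 1983 code: rank products by how often they are picked, then assign the most frequently picked products to the most accessible slots.

```python
from collections import Counter

# Hypothetical withdrawal log: each entry is the product picked for one order line.
withdrawals = ["cable-2mm", "cable-5mm", "cable-2mm", "conductor-A",
               "cable-2mm", "cable-5mm", "conductor-B"]

# Slots sorted from most accessible (front of the shed) to least accessible.
slots = ["front-1", "front-2", "mid-1", "mid-2", "back-1"]

# Rank products by withdrawal frequency, most frequent first, then pair
# them with slots so the best-selling products sit in the easiest spots.
ranked = [product for product, _ in Counter(withdrawals).most_common()]
layout = dict(zip(slots, ranked))
print(layout)
```

With this layout, the workers walk to the front of the shed for the products that appear on most orders, and only rarely to the back.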

This experience taught me to focus my efforts on the simple things that can have a major impact on daily processes; those little things add up quickly.

Now, more than 30 years later, as I sit here researching IBM’s architecture, I can appreciate the extent to which our technological progress is tied to traditional (and rather conservative) practices. Without fail, they manage to yield positive results, even against the backdrop of our frantic digital world.

Specifically, IBM’s Easy Tier function showed me the importance of using the information we already have to increase the performance and productivity of an entire organization. Given the ever-increasing volume of data a company interacts with, it would be unthinkable to keep it all in a single type of storage system. Quite simply, you need to store frequently accessed data on very fast storage, moderately accessed data on reasonably fast storage, and rarely accessed data on slow storage. This enables ultra-fast access to the data you use most while reducing the overall cost of the storage infrastructure, since slow storage is cheap. The technical term for this strategy is “tiered storage”.
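The three-tier rule can be made concrete with a tiny sketch. The tier names and access-rate thresholds below are illustrative assumptions of mine, not values from any IBM product:

```python
def assign_tier(accesses_per_day: int) -> str:
    """Map an access rate to a storage tier (thresholds are illustrative)."""
    if accesses_per_day >= 100:
        return "flash"   # frequently accessed -> very fast, expensive storage
    elif accesses_per_day >= 10:
        return "disk"    # moderately accessed -> reasonably fast storage
    return "tape"        # rarely accessed -> slow, cheap storage

# Hypothetical datasets with their daily access rates.
datasets = {"orders": 500, "invoices": 40, "archive-2010": 1}
placement = {name: assign_tier(rate) for name, rate in datasets.items()}
print(placement)  # {'orders': 'flash', 'invoices': 'disk', 'archive-2010': 'tape'}
```

The hot order data lands on the expensive tier, while the decade-old archive sits cheaply on tape.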

You may be wondering: How do I keep moving frequently accessed data to lightning-fast storage and occasionally accessed data to cheap storage, day after day? My short answer is: “Done by hand, the savings may not be worth the effort.” This is where IBM’s Easy Tier comes in handy. The function automatically measures how often a given set of data is accessed and then moves it to the appropriate tier (as shown in the following figure):

[Figure: IBM Easy Tier function for Tiered Data Storage]

I was amazed that IBM had not only created ultra-fast storage but also equipped the system with the “common sense” to maintain consistently high performance while avoiding unnecessary costs.

Quite simply, the tech giant helps companies avoid spending more than necessary without compromising on speed.

This seems to sum up the magnificent data architecture that IBM offers to speed-hungry companies that understand the importance of managing data from the ground level: i.e., storage. By increasing productivity and cutting costs, IBM has elegantly applied common sense to business management, generating value for entire organizations.

To discover more about IBM Storage and the Easy Tier function for Tiered Data Storage, you can visit this page.

