
The rules of enterprise IT

As I’ve pointed out before (possibly as I’m quite fond of games{{1}}) the game of enterprise IT has a long and proud history. I’ve also pointed out that the rules of this game need to change if enterprise IT — as we know it — is to remain relevant in the future{{2}}. This triggered a few interesting conversations at the pub on just what the old rules of IT are.

[[1]]Capitalise: A game for the whole company to play![[1]]
[[2]]People don’t like change. (Or do they?)[[2]]

Enterprise IT, as we know it today, is an asset management business, the bastard son of Henry Ford’s moving production line. Enterprise IT takes the raw material of business processes and technology and turns them into automated solutions. From those first card tabulators through to today’s enterprise applications, the focus has been on delivering large IT solutions into the business.

The rules of enterprise IT are therefore the rules of business operations. After a fair amount of coffee and beer with friends, the following 4 ± 2 rules seem to be a fair minimum set (in no particular order).

Keep the lights on. Or, put more gently, the ticket to the strategy table is a smooth running business. Business has become totally reliant on IT, while at the same time IT is still seen as something of a black art run by a collection of unapproachable high priests. The board might complain about the cost and pain of an ERP upgrade, but they know they have to find the money if they want to successfully close the books at the end of the financial year. While this means that the money will usually be found, it also means that the number one rule of being a CIO is to keep the transactions flowing. Orders must be taken, products shipped (or services provided), invoices sent and cash collected. IT is an operational essential, and any CIO who can’t be trusted to keep the lights on won’t even have time to warm up their seat.

Save money. IT started as a cost saving exercise: automatic tabulation machines to replace rooms full of people shuffling papers, networks to eliminate the need to truck paper from one place to another. From those first few systems through to today’s modern enterprise solutions, applications have been seen as tools to save time and money. Understand what the business process or problem is, and then support the heavy information lifting with technology to drive cost savings and reduce cycle time. Business cases are driven by ideas like ROI, capturing these savings over time. Keep pushing the bottom line down. These incremental savings can add up to significant changes, such as Dell’s make-to-order solution{{3}}, which enabled the company to operate with negative working capital (i.e. they took your cash before they needed to pay their suppliers), but the overall approach is still based on using IT to drive cost savings through the automation of predefined business processes.

[[3]]Dell’s make to order solution leaves competitors in the dust.[[3]]
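To make the arithmetic behind these business cases concrete, here’s a minimal sketch (with purely hypothetical figures) of the payback and ROI sums that tend to anchor them:

```python
# A sketch of a classic IT cost-saving business case.
# All figures are hypothetical, purely to illustrate the arithmetic.

upfront_cost = 500_000        # build and deploy the solution
annual_saving = 180_000       # clerical effort automated away
annual_running_cost = 40_000  # hosting, support, maintenance

net_annual_benefit = annual_saving - annual_running_cost

# Payback period and five-year ROI: the staples of these business cases.
payback_years = upfront_cost / net_annual_benefit
five_year_roi = (5 * net_annual_benefit - upfront_cost) / upfront_cost

print(f"Payback: {payback_years:.1f} years")  # ~3.6 years
print(f"Five-year ROI: {five_year_roi:.0%}")  # 40%
```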

Build what you need. When applications are rare, building them is an engineering challenge. You can’t just go to the store and buy the parts you need; you have to create a lot of the parts yourself in your own machine shop. I remember the large teams (compared to today) from the start of my career. A CORBA project didn’t just need a team to implement the business logic, it needed a large infrastructure team (security guy, transaction guy …) as well. Many organisations (and their strong desire to build – or at least heavily customise – solutions) still work under this assumption. IT was the department that marshalled large engineering teams to deliver the industrial-grade solutions which form the backbone of a business.

Ferrero Rocher: crunchy on the outside, soft and chewy in the middle.

Keep the outside outside. It’s common to have what is called a Ferrero Rocher{{4}} approach to IT: crunchy on the outside while soft and chewy in the middle. This applies to both security and data management. We visualise a strong distinction between inside and outside the enterprise. Inside we have our data, processes and people. Outside is everyone else (including our customers and partners). We harvest data from our operations and inject it into business intelligence solutions to create insight (and drive operational savings). We trust whatever’s inside our four walls, while deploying significant security measures to keep the evil outside.

[[4]]Ferrero[[4]]

It’s a separate question whether or not these rules are still relevant in an age when business cycles are measured in weeks rather than years, and SaaS and cloud computing are emerging as the dominant modes of software delivery.

Decisions are more important than data

Names and categories are important. Just look at the challenges faced by the archaeology community as DNA evidence forces history to be rewritten, breaking old understandings and changing how we think and feel in the process. Just who invaded whom? Or was related to whom?

We have the same problem with (enterprise) technology: how we think about the building blocks of the IT estate has a strong influence on how we approach the problems we need to solve. Unfortunately our current taxonomy has a very functional basis, rooted as it is in the original challenge of creating the major IT assets we have today. This is a problem, as it’s preventing us from taking full advantage of the technologies available to us. If we want to move forward, creating solutions that will thrive in a post-GFC world, then we need to think about enterprise IT in a different way.

Enterprise applications – the applications we often know and love (or hate) – fall into a few distinct types. A taxonomy, if you will. This taxonomy has a very functional basis, founded as it is on the challenge of delivering high-performance, stable solutions into difficult operational environments. Categories tend to be focused on the technical role a group of assets plays in the overall IT estate. We might quibble over the precise number of categories and their makeup, but for the purposes of this argument I’m going to go with three distinct categories (plus another one).

SABER @ American Airlines

First, there are the applications responsible for data storage and coherence: the electronic filing cabinets that replaced rooms full of clerks and accountants back in the day. From the first computerised general ledger through to CRM, their business case is a simple one of automating paper shuffling: put the data in one place and make access quick and easy, like SABER did, which I’ve mentioned before.

Next are the data transformation tools: applications which take a bunch of inputs and generate an answer. This might be a plan (production plan, staffing roster, transport plan or supply chain movements …) or a figure (price, tax, overnight interest calculation). State might be stored somewhere else, but these solutions still need some serious computing power to cope with huge bursts in demand.
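As a toy example of this category, here’s a sketch of the simplest sort of answer-generating calculation, an overnight interest figure. It’s stateless (inputs in, answer out), and the ACT/365 day-count is an assumption; real conventions vary:

```python
# A toy data transformation "solution": inputs in, answer out, no state kept.
# Assumes a simple ACT/365 day-count; real overnight conventions vary.

def overnight_interest(principal: float, annual_rate: float, days: int = 1) -> float:
    """Simple interest accrued over a number of days."""
    return principal * annual_rate * days / 365

# One night's interest on a hypothetical million-dollar balance at 5% p.a.
print(round(overnight_interest(1_000_000, 0.05), 2))  # 136.99
```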

Third is data presentation: taking corporate information and presenting it in some form that humans can consume (though looking at my latest phone bill, there’s no attempt to make the data easy to consume). This might be billing or invoicing engines, application-specific GUIs, or even portals.

We can also typically add one more category – data integration – though this is mainly the domain of data warehouses: solutions that pull together data from multiple sources to create a summary view. This category of solutions wouldn’t exist were it not for the fact that our operational data management solutions can’t cope with an additional reporting load. It’s also the category for all those XLS spreadsheets that spread through business like a virus, as high integration costs or more important projects prevent us from supporting user requests.
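To illustrate, here’s a data warehouse in miniature (the source data is made up and held in memory): pull records from a couple of operational sources and roll them up into a summary view, so the reporting query never touches the operational systems:

```python
# A data warehouse in miniature: pull data from multiple sources and
# roll it up into a summary view. All source data here is hypothetical.
from collections import defaultdict

orders = [  # extracted from the order management system
    {"region": "NSW", "amount": 1200.0},
    {"region": "VIC", "amount": 800.0},
]
returns = [  # extracted from the returns system
    {"region": "NSW", "amount": 150.0},
]

summary = defaultdict(lambda: {"sales": 0.0, "returns": 0.0})
for row in orders:
    summary[row["region"]]["sales"] += row["amount"]
for row in returns:
    summary[row["region"]]["returns"] += row["amount"]

# The summary view answers reporting questions without another
# query hitting the operational systems.
for region, figures in sorted(summary.items()):
    print(region, figures)
```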

A long time ago we’d bake all these layers into the one solution. SABER, I’m sure, did a bit of everything, though its main focus was data management. Client-server changed things a bit by breaking the user interface away from back-end data management, and then portals took this a step further. Planning tools (and other data transformation tools) started as modules in larger applications, eventually popping out as stand-alone solutions when they grew large enough (and complex enough) to justify their own delivery effort. Now we have separate solutions in each of these categories, and a major integration problem.

This categorisation creates a number of problems for me. First and foremost is the disconnection between what business has become, and what technology is trying to be. Back in the day when “computer” referred to someone sitting at a desk computing ballistics tables, we organised data processing in much the same way that Henry Ford organised his production line. Our current approach to technology is simply the latest step in the automation of this production line.

Computers in the past

Quite a bit has changed since then. We’ve reconfigured our businesses, we’re reconfiguring our IT departments, and we need to reconfigure our approach to IT. Business today is really a network of actors who collaborate to make decisions, with most (if not all) of the heavy data lifting done by technology. Retail chains are trying to reduce the transaction load on their teams working the tills so that they can focus on customer relationships. The focus in supply chains is on ensuring that your network of exception managers can work together to effectively manage disruptions. Even head office is focused on understanding and responding to market changes, rather than trying to optimise the business for an unchanging market.

The moving parts of business have changed. Henry Ford focused on mass: the challenge of scaling manufacturing processes to get costs down. We’ve moved well beyond mass, through velocity, to focus on agility. A modern business is a collection of actors collaborating and making decisions, not a set of statically defined processes backed by technology assets. Trying to force modern business practices into yesterday’s IT taxonomy is the source of one of the disconnects between business and IT that we complain so much about.

There’s no finer example of this than Sales and Operations Planning (S&OP). What should be a collaborative and fluid process – forward planning among a network of stakeholders – has been shoehorned into a traditional n-tier, database-driven, enterprise solution. While an S&OP solution can provide significant cost savings, many companies find it too hard to fit themselves into the solution. It’s not surprising that S&OP has a reputation for being difficult to deploy and use, with many planners preferring to work around the system rather than with it.

I’ve been toying with a new taxonomy for a little while now, one that tries to reflect the decision, actor and collaboration centric nature of modern business. Rather than fit the people to the factory, which was the approach during the industrial revolution, the idea is to fit the factory to the people, which is the approach we use today, post LEAN and flexible manufacturing. While it’s still a work in progress, it provides a good starting point for discussions on how we might use technology to support business in the new normal.

In no particular order…

Fusion solutions blend data and process to create a clear and coherent environment to support specific roles and decisions. The idea is to provide the right data and process, at the right time, in a format that is easy to consume and use, to drive the best possible decisions. This might involve blending internal data with externally sourced data (potentially scraped from a competitor’s web site); whatever data is required. Providing a clear and consistent knowledge-work environment, rather than the siloed and portaled environment we have today, will improve productivity (more time on work that matters, and less time on busy work) and efficiency (fewer mistakes).
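As a sketch of what a fusion view might look like (the data, field names and sources are all hypothetical), blending our internal prices with externally sourced competitor prices into a single decision-ready structure:

```python
# A minimal sketch of a fusion view: blend internal data with externally
# sourced data into one structure to support a pricing decision.
# Data and field names are hypothetical; the external feed might be
# scraped, bought, or shared by a partner.

internal_prices = {"widget": 9.50, "gadget": 24.00}    # from our own systems
competitor_prices = {"widget": 8.90, "gadget": 26.50}  # externally sourced

def pricing_view(sku: str) -> dict:
    """One coherent view to support a pricing decision for a given SKU."""
    ours, theirs = internal_prices[sku], competitor_prices[sku]
    return {
        "sku": sku,
        "our_price": ours,
        "competitor_price": theirs,
        "premium": round(ours - theirs, 2),  # positive means we're dearer
    }

print(pricing_view("widget"))
```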

Next, decisioning solutions automate key decisions in the enterprise. These decisions might range from mortgage approvals, through office work such as logistics exception management, to supporting knowledge workers in the field. We also need to acknowledge that decisions are often decision-making processes, which require logic (rules) applied over a number of discrete steps (processes). This should not be seen as replacing knowledge workers; a more productive approach is to view decision automation as a way of amplifying our users’ talents.
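A minimal sketch of the idea, using a hypothetical mortgage approval: a handful of rules applied over discrete steps, with anything borderline referred to a human:

```python
# A sketch of a decisioning pipeline: rules applied over discrete steps.
# The rules and thresholds are hypothetical.

def decide_mortgage(application: dict) -> str:
    # Step 1: hard eligibility rules.
    if application["age"] < 18:
        return "decline"
    # Step 2: affordability rule.
    if application["loan"] > 5 * application["income"]:
        return "decline"
    # Step 3: borderline cases go to a human underwriter; the automation
    # amplifies the knowledge worker rather than replacing them.
    if application["deposit"] < 0.2 * application["loan"]:
        return "refer"
    return "approve"

print(decide_mortgage(
    {"age": 34, "income": 90_000, "loan": 400_000, "deposit": 100_000}
))  # approve
```

The interesting design choice is the “refer” outcome: the automation settles the clear-cut cases and routes the ambiguous ones to the people best placed to decide them.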

Then there are manufacturing solutions: while we have a lot of information, some of it we will need to manufacture ourselves. This might range from simple charts generated from tabular data, through to logistics plans or maintenance schedules, or even payroll.

Information and process access solutions provide stakeholders (both people and organisations) with access to our corporate services. This is not your traditional portal or web-based GUI, as the focus will be on providing stakeholders with access wherever and whenever they need it, on whatever device they happen to be using. This might mean embedding your content into a Facebook app, rather than investing in a strategic portal infrastructure project. Or it might involve developing a payment gateway.

Finally we have asset management, responsible for managing your data as a corporate asset. This looks beyond the traditional storage and consistency requirements of existing enterprise applications to include the political dimension, accessibility (I can get at my data whenever and wherever I want to) and stability (earthquakes, disaster recovery and the like).

It’s interesting to consider the sort of strategy a company might use around each of these categories. Manufacturing solutions – such as crew scheduling – are very transactional: old data out, new data in. This makes them easily outsourced, or run as a bureau service. Asset management solutions map very well to SaaS: commoditised, simple and cost-effective. Access solutions are similar to asset management.

Fusion and decisioning solutions are interesting. The complete solution is difficult to outsource. For many fusion solutions, the data and process set presented to knowledge workers will be unique and will change frequently, while decisioning solutions contain decisions which can represent our competitive advantage. On the other hand, it’s the intellectual content in these solutions, and not the platform, which makes them special. We could sell our platform to our competitors, or even use a commonly available SaaS platform, and still retain our competitive advantage, as the advantage is in the content, while our barrier to competition is the effort required to recreate the content.

This set of categories seems to map better to where we’re going with enterprise IT at the moment. Consider the S&OP solution I mentioned before. Rather than construct a large, traditional, data-centric enterprise application and change our work practices to suit, we break the problem into a number of mid-sized components and focus on driving the right decisions: fusion, decisioning, manufacturing, access, and asset management. Our solution strategy becomes more nuanced, as our goal is to blend components from each category to provide planners with the right information at the right time to enable them to make the best possible decision.

After all, when the focus is on business agility, and when we’re drowning in a sea of information, decisions are more important than data.