Yuri Aguiar, Chief Information Officer, Ogilvy & Mather
Ask almost any company whether they have tried their hand at Cloud-based computing, and you will find that they have indeed done so through small experiments and pilot engagements. Ask what percentage of their applications or middleware sits in this model in full production, however, and you will see that figure drop considerably. Though the technology has been rather successful in many instances, the validation and security debate over Public and Private Cloud services for medium and large Enterprises will continue well into this year and the next. That’s the first trend I see.
While this powerful technology allows for rapid improvement in ‘time to market’ for new products and services, many companies are still grappling with a ‘should we or shouldn’t we’ question. There are valid reasons for some companies, especially those with strict regulatory requirements, to stay off Public Cloud services until the model matures further. But for the vast majority, partnerships with the right vendors can deliver significant business benefits and transparency in costs.
My expectation of vendors in this space is that they collaborate more on integration rather than build proprietary solutions. Some of the best technology companies in the business offer an open architecture with APIs to their application stack that work with co-existing internal platforms or a hybrid model. This could make Middleware as a platform or service a significant trend for large organizations.
"In today’s times not only are we dealing with a tremendous volume of data, we are dealing with highly sophisticated customers who understand it"
Rapid Increase in Volumes of Data
Here are two examples to illustrate this phenomenon:
• Six years ago our organization moved around three Terabytes of data per month around our network. Today that figure is in the low double-digit range and growing rapidly.
• Another example from six years ago: a video conference between a ‘room’ in New York and one in Singapore would suffice as a high-quality communications experience. Last year we conducted a high-definition test video conference from a flight over the Atlantic to Munich, Germany.

Both examples would have been considered ‘unthinkable’ a few short years ago, but here we are bearing down on our networks, satellites and airwaves with more digital content than ever before. This rapid increase in volumes is due to the fact that almost everything we do as a digital agency requires collaboration with other markets, security and portability of information, telecommuting capabilities and 24x7 operations. So no, nothing will ever be fine; as long as there is a newer, presumably better or more cost-effective way of doing things, we will aggressively pursue it. Given this backdrop, it’s the collective, and rarely just any one thing, that keeps us alert at all times.
The User Experience and Big Data
Geoffrey Moore said it best, and I paraphrase: “…from Systems of Record to Systems of Engagement”. This is a formidable challenge and one in particular that I take very personally.
The focus on what we call ‘the User Experience’ has significantly improved, and given that most of the connected world will soon live entirely in the browser, this couldn’t come soon enough. Browsers encourage us to become device-agnostic but also force us to think about how our customers engage with our business applications on screens of various form factors, all while simplifying transactions. I consider this a significant trend in the medium term.
In the same frame of reference, there is a lot of buzz around the overused term ‘Big Data’. This term is ultimately a brand name for what has already existed for years, and continues to grow exponentially. Culling information from these volumes of data into what can be used in a practical sense is where the real challenge lies.
Not only are we dealing with a tremendous volume of data, we are dealing with highly sophisticated customers who understand it, know exactly what they want and are used to an ‘On-Demand’ approach to everything. In this scenario, the implications for a technology group are significant. Not only do we need highly skilled resources and a mechanism to absorb any spikes in demand, we also need burstable computing capabilities and robust Access Controls to manage security on the front end.
One trend will be to continually speed up and simplify this process by putting the querying mechanisms in the hands of the intelligent consumer of the information. It would mean more complex training, but it is surely worth the effort.
My Role as a CIO
The role has certainly changed over time and will continue to evolve. A decade ago it was about ERP implementations, data centers and networks; a few short years ago, during the financial crisis, it was about cost controls and shadow-IT management – two extremely divergent areas of focus. It seems that with every major milestone achieved by CIOs, the role definition is adjusted to resolve a ‘new and improved’ crisis or issue.
Personally, I find that the connected Enterprise comes with its own challenges such as centralization, application streamlining, the adoption of social tools and a plethora of devices on which users consume critical information.
In spite of all of the above, the core of the CIO function remains the same for me, that of managing information technology to address three key areas:
1. Engage front line business leaders to ensure Productivity is front and center of the IT strategy
2. Construct a strategy that addresses Revenue and Profitability of the company
3. Maintain the highest level of Execution and Operational Efficiency
The above manifests itself in many ways, from Business Intelligence to Client Engagement, and seeing it through is a tall order. But doing it consistently year after year speaks volumes for the people you work with. I believe that the role of CIO gets the attention, but it is only one part of the equation – the team that supports the CIO is crucial in measuring success.