ORACLE: Swinging the Bat in Cloud Services

It's hard to believe that an entire month has gone by since Oracle OpenWorld in San Francisco — but baseball fans will have noticed that things have been a bit hectic here in Chicago of late.  Coincidentally, the Chicago Cubs clinched a playoff berth on September 18th, the very day that Larry Ellison officially opened OpenWorld with the first of his several keynote presentations.

My primary motivation for attending OpenWorld was to get an update on Oracle's two banking platforms — the new flagship Oracle Banking Platform (OBP) aimed at large retail banks and its stablemate FlexCube, the universal banking platform deployed by nearly 600 banks globally.  After three days at OpenWorld, I realized that the real story of interest to banks is Oracle's emerging cloud story, which, coupled with its existing core banking applications business, puts the company in a strong position to transform the core banking systems market.

Now a robust 72-year-old, Larry joked with the audience about being prohibited from climbing the stairs to the keynote stage, but he still exhibits the intense competitive fire that served his company well during the ERP battles of the 1980s and 1990s.  The difference is that these days, he's less focused on IBM or SAP than on two new challengers:  Amazon Web Services (AWS) on the IT infrastructure side and Workday on the applications side.

Of course, what AWS and Workday have in common is that both are businesses whose long-term prospects are predicated on the continued growth and development of cloud services.  Larry's first keynote noted that we are witnessing generational change as companies move from "lots of individual data centers" to a smaller number of "super-data centers called Clouds".  In a separate presentation, Oracle CEO Mark Hurd shared Oracle's view that within the next ten years, 80% of corporate-owned data centers will have been closed, with the remaining data centers operating at 20% of today's capacity and running legacy workloads that are not easily ported to a cloud services environment (hey COBOL — I think they're talking about you!).


According to Larry, Oracle's "overnight success" in cloud services began ten years ago when it started reengineering its original ERP products — licensed software designed primarily for the on-premise market — into a new multi-client, multi-tenant architecture, as befitting a company that was pivoting to the emerging SaaS model.  At OpenWorld, Larry shared how Oracle was extending its original SaaS business to embrace both the Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) models — putting AWS, Google, and Microsoft all squarely in Oracle's competitive sights.

Of the three, AWS appears to be the primary target of Oracle's competitive ambitions:  Oracle's new "Gen2" IaaS platform offers a virtual machine (VM) with twice as many cores, twice as much memory, four times as much storage, and more than ten times the I/O capacity of a comparable AWS VM.  But there is a catch, according to Larry:  "you have to be willing to pay less" than what AWS charges for a comparable VM.  (While AWS might take issue with Larry's claims about performance and value, what is clear is that Oracle is planning a serious competitive challenge to AWS's supremacy in the IaaS race.)

In Larry's words, "Amazon's lead is over.  Amazon's gonna have some serious competition going forward."


This represents great news for the growing number of banks that have gotten past the question "Why cloud?" and have moved on to the more interesting question "How cloud?".  For the largest banks like Capital One, with significant in-house IT development capabilities and a willingness to experiment with new technologies, AWS makes a lot of sense.  Banks can roll their own code, spin up a VM, and off they go.  Capital One's cloud journey has been so compelling, in fact, that the bank is in the process of closing down 5 of its 8 data centers while shifting many workloads to AWS.  Most banks, however, are not Capital One.

For the mere mortals among us — banks coping with practical limitations on their ability to develop and host banking apps — having an IT partner with demonstrable experience on both the application side and the infrastructure side can make a real difference to the bank's appetite for a leap into cloud-based banking services.  While some banks have in the past been a bit overwhelmed by Oracle's ambitious sales pitch featuring its all-singing, all-dancing suite of integrated applications ("that's very impressive, but I only want a G/L!"), with Oracle's new IaaS offering a bank could mix and match Oracle applications (offered via SaaS) with third-party and in-house developed systems.  That's potentially a game changer for banks interested in cloud services but overwhelmed by the complexity of going it alone with a public cloud provider.

That brings us back to OBP and FlexCube.  OBP is a relatively new Java-based core banking solution built for the needs of large-scale retail banks.  As such, OBP is aimed squarely at mainframe-based core platforms like Hogan, Celeriti, and Systematics.  FlexCube is a universal banking platform and has also seen renewed investment from Oracle in the last few years.  While today its primary market appears to be international banks, as a modular solution FlexCube can address specialized needs in the US market like cash management and trade finance.  OBP and FlexCube continue to compete in the global core banking systems market on their own terms — with OBP having some recent success underpinning a new digital banking platform at KeyBank (Cleveland) and serving as the foundation of a complete core replacement project at National Australia Bank (Melbourne).

For larger banks intrigued by the promise of cloud services but daunted by the complexity of building and operating their own environment, the opportunity to pull down an OBP license hosted in the Oracle Cloud while moving other applications from their private data center to Oracle's IaaS platform could, in one move, achieve the twin goals of core banking system and data center transformation.  That's a rare 2-for-1 in a world where a widely held truism is that every IT decision involves trade-offs among alternatives.

According to Larry, Oracle's annual revenue run rate for cloud is currently about $4 billion.  Amazon recently announced that its revenue run rate for cloud services was north of $10 billion, while just yesterday Microsoft announced that its own cloud revenue run rate has approached $13 billion.  These are undoubtedly different businesses (AWS is more or less pure-play IaaS, while Microsoft skews towards SaaS by virtue of the strength of its Office 365 business), so I won't pretend to make any apples-to-apples comparisons.  The point remains that, while it's still early in the cloud ballgame for Oracle, with deep financial resources, an impressive portfolio of banking applications, and Larry's intense "Will To Win" against the current market incumbents, bank CIOs need to pay close attention to what is going on in Redwood Shores.

Two Hallmarks of Successful Branch Transformation Initiatives

Since my coverage areas include branch and ATM channel technologies, I often get asked, “What distinguishes successful branch channel transformation initiatives?”

Questions like this cut to the chase. Spare me all the charts & graphs, Bob, just tell me what successful institutions are doing. Fair enough. But, don’t we all want easy answers? How many diets are out there being promoted? They all sound pretty easy. If only…

But I got to thinking… There are at least two hallmarks of successful branch transformation initiatives, despite the diversity of approaches and outcomes. Here goes:

1. Two, Not One

Except for the smallest of community banks, branch channel transformation involves two concurrent initiatives – one for the current network and one for the future network. Why’s that?

Most banks appear to associate branch channel transformation with radical changes in the branch operating model. Arguably, for many banks, radical changes are needed. At the same time, very few North American financial institutions appear to have a clear vision of what they’d like to build. The “branch of the future” is not yet in focus. This is understandable given the cacophony of vendor voices urging banks to adopt a growing variety of physical designs, automation approaches, and paths to superior customer engagement. Banks should, in Celent’s opinion, embark on an ambitious branch of the future project with deliberate caution and methodical rigor. Proceeding in this manner — even with swift internal decision-making — will take several years. And implementation is rarely a “big bang.” Instead, new designs are rolled out over time, taking years to reach their full impact.

The problem with this approach is two-fold. First, it tends to justify inaction until a clear future branch vision is embraced. After all, how can one begin a journey unless the destination is clear? The second problem is more significant — it confuses developing a future branch design vision with preparing the existing branch infrastructure for those new designs. For example, physical design is clearly a new branch design element. By contrast, underlying software platform choices and how new loans and deposit accounts get originated can impact both current and future branch designs.


I’ve spoken with too many banks that, for example, postpone a teller image capture initiative in legacy branches until their “future branch” design is finalized. Most institutions are under pressure for short-term results, yet most branch transformation efforts won’t produce a near-term ROI. Two projects are needed – one focused on the current network and another focused on the future network – with close coordination between the two.


2. Lead with Human Capital, Not Technology

The second hallmark has to do with when human capital plans are implemented – prior to, coincident with, or following future branch initiatives. Strongly held opinions abound. What appears to resonate broadly is this: branch interactions are becoming more about sales and service and less about transactions. This invites new, more highly trained roles with a different skill mix.

The prevailing argument for positioning human capital strategy at the tail end of the journey is typically cost-focused. No one wants to pay the price to recruit, train and compensate Universal Bankers – only to spend much of their day playing the teller role.

The prevailing argument for leading with human capital is user experience-focused. In the final analysis, what differentiates a branch experience from the constantly improving digital experience, if not face-to-face engagement? Leading with human capital may indeed be a more costly experiment. But every financial institution I’ve interviewed that did so is glad it did. Conversely, every institution I’ve interviewed that didn’t (and there are many) wishes it had.

“Transforming the Landscape” – My learnings from SIBOS 2016

The fall conference season is a busy time for those of us in the industry research business. I’ve finally recovered from a hectic week in Geneva, where I met with over 40 banks, technology companies, and consulting firms to discuss what’s happening in global transaction banking. This year’s Sibos, themed “Transforming the Landscape”, was organized around four streams: Banking, Compliance, Culture, and Securities. A selection of Sibos session recordings is available on the Sibos website.

Given my research focus on Corporate Banking, my discussions centered on three key topics.

  • SWIFT’s global payments innovation (gpi) initiative:  SWIFT announced that it had successfully completed the first phase of the gpi pilot, surprising some bankers with its ability to meet the first milestone so quickly. The initial objective of gpi is to improve the speed of cross-border payments (starting with same-day) and to improve transparency with new end-to-end payment tracking. SWIFT staffers roamed the exhibition hall with iPads demonstrating gpi’s new payment tracker. It remains for banks to integrate the new payment type into their corporate digital channels and to determine product pricing (a rough sketch of what a bank-side status lookup might look like follows this list).


  • PSD2 and UK Open Banking:  Technology providers, especially those that offer core banking systems along with payments technology, are working closely with regulators and industry groups to enhance their product offerings to accommodate the third-party account information access and payment initiation provisions of PSD2, along with the UK’s Open Banking API Framework. Looking beyond mere compliance, both providers and banks are developing value-added services to capitalize on the significant disruption arising from opening traditional banking capabilities to third parties.
  • Blockchain in Corporate Banking:  After publishing a Celent report on use cases for blockchain in corporate banking earlier this year, I was heartened to hear “real world” blockchain announcements from the big tech companies touting their banking collaborations. Swiss bank UBS is working with IBM on a project to replicate the entire lifecycle of an international trade transaction. The FX settlement service CLS is building a payments netting service that will enable cash trades on IBM’s Fabric blockchain. Bank of America and Microsoft announced their intent to build and test blockchain applications for trade finance.  Although much progress is being made by blockchain consortia, banks, and technology providers, most people I talked to believe that significant adoption of blockchain for corporate banking use cases is still a few years in the future.
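To make the gpi discussion a bit more concrete, here is a rough Python sketch of how a bank might surface gpi-style tracking inside its corporate digital channels: look a payment up by its end-to-end reference and return the simplified status view a portal would display. To be clear, the endpoint, authentication scheme, and field names below are invented for illustration; this is not SWIFT's actual tracker interface.

```python
# Hypothetical sketch only: the URL, auth, and response fields are illustrative,
# not the real SWIFT gpi Tracker API.
import requests

TRACKER_URL = "https://api.example-bank.com/gpi/v1/payments"  # invented gateway


def get_payment_status(reference: str, api_token: str) -> dict:
    """Look up a cross-border payment by its end-to-end reference and
    return the simplified view a corporate portal might display."""
    resp = requests.get(
        f"{TRACKER_URL}/{reference}/status",
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    return {
        "reference": reference,
        "status": payload.get("transaction_status"),    # e.g. pending / credited
        "last_bank": payload.get("last_update_agent"),  # where the payment sits now
        "charges": payload.get("deducted_charges"),     # fee transparency for the corporate
    }
```

The interesting product questions (pricing, service levels, and how raw tracker statuses are translated into customer-facing language) all sit on top of a call like this one.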

I’m off next week to attend the Association for Financial Professionals (AFP) Annual Conference, hoping to bring back news of developments in the world of corporate treasury and treasury management.

Stop Throwing Money at Cybersecurity

Most cyberattacks succeed because of weaknesses in people, processes, controls and operations. This is the definition of operational risk. Therefore, it makes sense to tackle cyber risk with the same tools you use to manage operational risk.

Experience continues to show that the traditional approach of the IT department managing cybersecurity is not working. Cyber risk is typically treated in parallel with other technology risks; the IT department is motivated to focus on securing the vulnerabilities of individual system components and offers only a micro view of security concerns.

My new Celent report, “Treating Cyber Risk as an Operational Risk: Governance, Framework, Processes and Technologies”, discusses how financial institutions are advancing their cybersecurity practices by leveraging their existing operational risk frameworks to centralize, automate, and streamline management, technologies, processes, and controls for sounder and more resilient cybersecurity.

The report identifies and examines the steps required to achieve a risk-based, sustainable, and ultimately measurable cyber risk management strategy:

1. Establish a long-term commitment to drive a top-down, risk-based approach to cybersecurity.

2. Recognize that the traditional approach of the IT department managing cybersecurity is limited and that most cyber risks stem from weaknesses in people, processes, controls, and operations.

3. If you have not already, consider deploying the NIST cybersecurity framework and tailoring it to fit your individual cybersecurity requirements. The framework lets you take advantage of your current cybersecurity and operational risk language, processes and programs, industry standards, and industry best practices. Both cyber and operational risk should be informed by and aligned with the institution’s enterprise-wide risk management framework.

4. Move your organization along the cybersecurity maturity curve by building dynamic risk models, based on shared industry data and assumptions, to measure and monitor cyber threats and pre-empt those attacks.

5. Stop throwing money at the problem. Educate decision-makers on why and how breaches happen. Do not purchase in silos or under pressure; select the right expertise to identify the issues and carry out due diligence on products.

6. Use the five functions of the NIST framework (Identify, Protect, Detect, Respond, Recover) to navigate and manage cybersecurity technology requirements and purchases; a simple coverage-mapping sketch follows this list.

7. Know what technology you want from your vendors; know what advice to seek from your consultants.

8. Acknowledge that cybersecurity is the responsibility of every employee and that human behavior is the most basic line of defense. Institutions cannot hesitate in educating their employees, third parties, and customers.
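One way to act on points 5 and 6 is simply to map every planned control or purchase to one of the framework's five functions before any money is spent, which makes coverage gaps visible. The Python sketch below is a minimal illustration of that exercise; the control names are examples I have made up, not recommendations.

```python
# Minimal sketch: map candidate controls/purchases to the five NIST Cybersecurity
# Framework functions and flag any function with no coverage. Control names are
# illustrative only.
NIST_FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

planned_spend = {
    "asset inventory & risk register": "Identify",
    "access management / MFA":         "Protect",
    "next-generation firewall":        "Protect",
    "SIEM log monitoring":             "Detect",
    "incident response runbooks":      "Respond",
    # note: nothing mapped to "Recover" yet
}


def coverage_gaps(spend: dict) -> list:
    """Return the NIST functions with no planned control or purchase."""
    covered = set(spend.values())
    return [f for f in NIST_FUNCTIONS if f not in covered]


if __name__ == "__main__":
    print("Uncovered functions:", coverage_gaps(planned_spend))  # ['Recover']
```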

The Mobile Banking and Payments Summit – Impressions from Day 2

A couple of weeks ago I attended the Mobile Banking and Payments Summit in NYC for the first time.  There was an impressive list of experts from institutions such as JPMC, Barclays, Citibank, BNP Paribas, the Federal Reserve, USAA, Capital One, BBVA, and Moven, among others. I was only able to attend the final day, but it didn’t disappoint.  The day focused mostly on mobile wallets, with a few main points shared below:

  • Mobile wallets have been challenged by industry barriers:  The old rule of thumb with a payments scheme is that it needs to please three parties: the merchant, the bank, and the consumer.  Mobile wallet products and solutions have traditionally fallen short with one or more of these parties, essentially stalling much of the progress.
    • There’s still plenty of fragmentation in the market:  Android is an open system utilizing Host Card Emulation (HCE), while Apple is a closed system using a secure element.  There are others beyond that, but this fragmentation has largely contributed to a lack of standardization and unimpressive overall adoption.  We know this is largely understood by banks and merchants, and many are willing to play along for the time being.
    • Consumers can misunderstand mobile wallets: Many users of Apple Pay, for example, have a poor understanding of how the system actually works, often assuming Apple holds their card details.  While the system is safer than traditional cards, the perception that it’s less safe is keeping many users from adopting it (a conceptual sketch of the tokenization model behind these wallets follows this list).
    • Getting the marketing right is tough: Often, the mobile wallet really isn’t about the payment so much as the experience around the payment.  It might be easier or there might be a whole host of incentives like rewards wrapped around it.  The potential is there, but until recently the market hasn’t been.
  • But many barriers are beginning to fall away, and there’s hope for adoption: For years, the industry has been declaring that FINALLY this year will be the year mobile wallets take off.  The industry has been crying wolf for a long time, but there are some promising developments that could make mobile wallets a larger share of the payments universe.  Currently in the US, 55% of merchants have updated their payment terminals, and 70% of consumers have chip cards.  The chip card does a lot for security, but the argument is that it adds friction to the checkout experience.  With the card dip taking away from the user experience, the expectation is that mobile wallets will finally offer enough UX improvement over traditional cards that consumers might opt for them during payment.  It’s also reported that more than 50% of millennials have already used a mobile wallet at least once, whether Apple Pay, Android Pay, or Samsung Pay.  The growth in adoption among younger consumers is a good sign that broader adoption might not be too far behind.
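For readers less familiar with why Apple Pay and its peers are safer than handing over a card number, the sketch below illustrates network tokenization in miniature: the wallet is provisioned with a device token rather than the real card number, the merchant only ever receives that token plus a one-time cryptogram, and only the token vault on the network/issuer side can map it back to the underlying card. This is a toy illustration of the general model, not any provider's actual implementation.

```python
# Toy illustration of network tokenization (the general model behind wallets
# like Apple Pay). All names and logic are conceptual, not a real scheme.
import hashlib
import hmac
import os
import random

TOKEN_VAULT = {}  # held by the payment network / token service, never by the wallet


def provision_card(pan: str) -> str:
    """Enrollment: issue a device token in place of the real card number (PAN)."""
    device_token = "4" + "".join(str(random.randint(0, 9)) for _ in range(15))
    TOKEN_VAULT[device_token] = pan
    return device_token


def pay(device_token: str, amount: float, device_key: bytes) -> dict:
    """What the merchant receives: a token and a one-time cryptogram, never the PAN."""
    nonce = os.urandom(8).hex()
    msg = f"{device_token}|{amount}|{nonce}".encode()
    cryptogram = hmac.new(device_key, msg, hashlib.sha256).hexdigest()[:16]
    return {"token": device_token, "amount": amount, "cryptogram": cryptogram}


def authorize(txn: dict) -> str:
    """Only the network/issuer side can resolve the token back to the real card."""
    pan = TOKEN_VAULT.get(txn["token"])
    return "approved" if pan else "declined"


token = provision_card("4111111111111111")
receipt = pay(token, 25.00, device_key=b"per-device-secret")
print(receipt["token"] != "4111111111111111", authorize(receipt))  # True approved
```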

My colleague Zil Bareisis has written about this quite a bit, and agrees that adoption could be driven by the emergence of EMV as well as an increase in handsets that support wallet payments. Wallets are also striking partnerships to add value, including introducing merchant loyalty, coupons, etc. The launch of Walmart Pay is a great example of a retailer applying these concepts internally, facilitating even greater adoption. For more information see any of the number of reports Zil has written on the topic.

  • Midsize institutions have a few paths to follow in implementing a mobile wallet: Banks want to be a part of the adoption, but have so far taken a wait-and-see approach, unsure about the potential of existing wallets and still trying to figure out what it means for them as the issuing bank. There are three primary ways a midsize or smaller bank can try to launch a wallet:
    • Building an internal wallet: This provides the most control and customization, flexibility of functionality, and ownership of the release schedule.  The drawbacks are that it can be a complicated task, a large investment is required, the institution needs sufficient subject matter expertise in-house, and there would be no access to NFC on Apple devices (which Apple reserves for Apple Pay).
    • Buying a turnkey white label wallet: A turnkey solution would have the benefit of being plug-and-play; there would be some customization options, functionality would be built in, fewer resources would be required, and the vendor would provide some subject matter expertise.  There would, however, be less control over the product, the wallet could be processor dependent, and the roadmap wouldn’t be controlled by the institution.
    • Participating in an existing wallet: For many, this is the road that will result in the largest adoption.  The options are fairly universal, with Samsung Pay, Apple Pay, and Android Pay being the main choices.  It’s plug and play, easy to get traction with, gives customers plenty of choice, and is relatively frictionless.  The drawbacks are mainly the lack of customization options and control over the direction of the wallet.

We often say that we go to these conferences so that our subscribers don’t have to.  This is just a short summary of the day, and obviously there was much more detail shared. We encourage all of our readers to attend these events, but we’ll be there in case they can’t make it.

Key Takeaways from Sibos 2016

Having just returned from the whirlwind that is Sibos, I (along with many other industry observers) feel compelled to contribute my two cents on the top takeaways from the event, along with one observation on the mood. Nothing about Sibos can be exhaustive, but three key areas stood out: Cyber, PSD2, and Open Banking / APIs.

Cyber was the first topic mentioned in the opening plenary address. With its seriousness brought into stark relief by the $81 million Bangladesh Bank incident (something my cab driver in Boston asked about on the way to the airport!), cyber was a focus throughout the conference. While it has long been an important issue, it has catapulted to the top of the agenda of every member of SWIFT’s ecosystem given the recognition that the system is only as secure as its weakest node.

PSD2 is often thought of in a retail banking context, but its implications will carry over to the corporate side as well. There are two critical points: 1) banks must make their customers’ data accessible to any qualified third party, and 2) third parties can initiate payments. These changes will have profound second-, third-, and even fourth-order effects that can scarcely be imagined today. Banks are thinking through what they need to do to comply, as well as what their strategies should be once they’ve implemented the necessary (and not inconsequential) technology changes. For a primer on the current state of PSD2, see Gareth Lodge’s recent report on the subject.

Open Banking is enabled by APIs. While PSD2 is certainly accelerating the concept, it would have been gaining momentum even without the external pressure. There are simply too many activities that can be done better by third parties than by banks, and the banks have realized that they need frictionless ways to tap into these providers. APIs are a critical mechanism to enable this interaction. Technology, of course, is a necessary but not sufficient condition for success; banks must be culturally able to integrate with new partners quickly and flexibly.
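To ground those two PSD2 provisions in something tangible, here is a hypothetical Python sketch of the account information and payment initiation interactions, the two things a qualified third party will be able to do against a bank's API. The base URL, paths, and payload fields are invented for illustration; the real Open Banking and PSD2 technical standards (consent flows, strong customer authentication, third-party registration) are still being defined and will differ.

```python
# Hypothetical sketch of the two PSD2 access patterns; endpoints and payloads
# are invented, not an actual Open Banking specification.
import requests

BASE = "https://api.example-bank.com/open-banking/v1"  # invented base URL


def fetch_accounts(access_token: str) -> list:
    """Account information access: a licensed third party reads account data
    with the customer's consent."""
    r = requests.get(
        f"{BASE}/accounts",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    r.raise_for_status()
    return r.json().get("accounts", [])


def initiate_payment(access_token: str, debtor_iban: str,
                     creditor_iban: str, amount: str, currency: str = "EUR") -> dict:
    """Payment initiation: a third party instructs a payment from the
    customer's account, again under the customer's consent."""
    body = {
        "debtorAccount":    {"iban": debtor_iban},
        "creditorAccount":  {"iban": creditor_iban},
        "instructedAmount": {"amount": amount, "currency": currency},
    }
    r = requests.post(
        f"{BASE}/payments",
        json=body,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    r.raise_for_status()
    return r.json()  # typically a payment id and a consent/status to poll
```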

On a final note, the mood was pragmatic. The atmosphere wasn’t one of consternation, panic, or confusion. Instead, the buzz was focused, purposeful, and businesslike. Bankers and their service providers are ready to roll up their sleeves and get the job done instead of wringing their hands about all of the possible ill-fated futures that could arise. We at Celent look forward to the progress to come in 2017. What are your thoughts?

Impressions from Finovate Fall 2016

A few weeks ago I attended Finovate Fall 2016 with a few different colleagues of mine in New York.  For those who’ve never been, Finovate hosts three main events (New York, San Francisco, and London) where more than 70 fintech companies are able to present new concepts, services, or products in a rapid 7-minute format.  […]

Where Will We See You Again?

When the leaves start falling, it usually means one thing for Celent analysts – the conference season is getting into full swing and it’s time for us to hit the road big time. The team is already busy at SIBOS this week, with BAI and AFP coming in a few weeks. Personally, I am looking […]

The Evolving ACH Landscape

We’ve been tracking blockchain, distributed ledgers, etc. for a number of years, and we’ve always been enthusiastic about the promise…but pointed out that it isn’t quite there yet, at least for payments. An announcement today caught our eye: "The Innovation Engineering team at Royal Bank of Scotland has built a Clearing and Settlement Mechanism (CSM) […]

US EMV Migration: Looking for the Silver Lining in the Clouds

It would be easy to assume that the migration to EMV in the US has gone terribly. The press is full of stories about slow transactions, inconsistent customer experiences and slow merchant adoption. Whilst not living this day-to-day, I also experienced this frustration first-hand on my trips to the US earlier this year; I wrote […]