James M. O'Neill

About James M. O'Neill

James O'Neill is a senior analyst with Celent's Banking practice. His areas of expertise span all major channel and back office banking systems, particularly in corporate banking, systems architecture of legacy systems, and the impact of cloud computing.

Celent Model Bank Awards 2017: The Legacy Perspective


We’re less than two weeks away from this year’s Insight & Innovation Day!

Once again my presentation of the Model Bank Awards for Legacy Transformation will serve as the only barrier to the much anticipated announcement of Celent’s Model Bank of the Year Award for 2017.  This is my third year speaking at our Insight & Innovation Day conference, and my third year “bringing up the rear”.  I’m not sure whether this status reflects the critical importance of Legacy Transformation, in the way that “Actress in a Leading Role” presages the much anticipated “Best Picture” Award at the Oscars.  More likely, it simply reflects the reality that most banks consider Legacy IT boringly reliable and not very…well… sexy.

Legacy IT is a term that most bankers associate with mainframe-based core banking systems (CBS) – systems that have never seemed to get much respect.   Most bankers place Legacy IT on the “pay no mind” list, thinking about CBS platforms as much as they ponder the Ethernet cables traversing the space in the ceiling above their cubicles.  This pay-no-mind approach seems to work well, until the day when the bank experiences a massive payments screw-up, or a systems availability issue arises, triggering a Greek chorus of blame from the press and industry pundits — "damn those aging back-office systems"!  

Given my tempered expectations, I was astonished at the number and diversity of very interesting projects that were nominated for Model Bank Awards for 2017. If this year’s nominations are any indication of what’s going on in the market, it appears that banks are finally looking to transform legacy IT from something they simply have to live with to something from which they can create competitive advantage.  Of course, CBS platforms will continue working in the background to support the bank’s strategic mission – the perennial nominee for “Best Supporting Actor/Actress” rather than “Best Actor/Actress” – but it’s also very clear that innovations like real-time payments and Open Banking will only go as far as the bank’s back-office systems will carry them.

My review of the various nominations we received this year reflects a shift in how large FIs around the globe are viewing CBS transformation:  while traditionally CBS renewal was viewed as addressing a “problem to be fixed”, increasingly it’s becoming an “opportunity to be seized”.  The old view of CBS projects was driven by the simplistic notion that COBOL was bad simply because it was old.  To the extent that a bank’s motivation was based simply on replacing the old for the sake of modernity, it explains why many CBS renewal projects have been abandoned or never attempted at all.  I’d personally prefer my bank to be running on an old and well-maintained system than on a new and poorly designed one.

The important nuance here is that if a bank cannot achieve internal consensus around the issue of legacy CBS, it’s relatively easy to continue to kick the can down the road – which is what many large and strategically important global banks continue to do.  On the other hand, the banks included in our group of Model Bank Award nominees view CBS renewal through a different lens, one that considers CBS renewal as an enabler of operational agility, a catalyst of back-office efficiency, and a source of other important benefits.  Thus, few references were made this year to "old versus new", but rather to “delivering banking services anytime, anywhere, at scale and using technology to relentlessly drive efficiency”, as one Model Bank Award nominee articulated the emerging opportunity for CBS renewal.

Reflecting the diversity of the Model Bank Award nominations we received for Legacy Transformation, this year we will be making three separate awards for innovative projects that are powered by innovative CBS platform implementations:

  • Legacy Transformation:  This Award winner exemplified the long-term approach that CBS transformation demands, particularly for the large banks, and demonstrated that legacy transformation is not a sprint measured on a quarter-by-quarter basis but a marathon that plays out over many years.
  • Banking in the Cloud:  Casting aside conventional doubts and concerns about the security and regulatory acceptability of cloud services, this Award winner built its entire stack of banking services wholly in the cloud, and thus serves as a model for other banks to observe and emulate in time.
  • Financial Inclusion:  CBS transformation projects are typically aimed at increasing organizational agility and reducing back-office IT costs; this Award winner, however, has demonstrated the social impact of building modern IT platforms by reaching the unbanked with a growing array of modern banking services.

I’m looking forward to presenting these very worthy winners with their Model Bank Awards, while also sharing my observations regarding how the conventional view of Legacy Transformation needs to evolve along with all of our thoughts and preconceptions regarding the importance and nature of innovation in banking.

Fintech’s Beneficiaries: Two Approaches to Regulation

British Prime Minister Theresa May visits the United States this afternoon to address a gathering of Republican lawmakers in Philadelphia, followed by a visit to the White House tomorrow.  Tomorrow’s meeting is noteworthy, as Prime Minister May will be the first foreign leader to meet with President Trump since the latter’s inauguration only last week. The timing is also interesting, as only two weeks ago outgoing President Obama’s National Economic Council (NEC) released a new whitepaper called A Framework for FinTech.  The NEC, a policy advisory unit of the White House established in 1993, proposed 10 high-level principles designed to move the US fintech industry forward. The whitepaper resulted from the White House’s FinTech Summit in June 2016, which brought together a wide range of bankers, policy makers, and other interested parties, and its subtext was that cooperation among all stakeholders would yield greater innovation in financial services, as summed up below:
“[A] policy strategy that helps advance fintech and the broader financial services sector, achieve policy objectives where financial services play an integral role, and maintain a robust competitive advantage in the technology and financial services sectors [will]  promote broad-based economic growth at home and abroad.”
Innovation in financial services has been on the agenda of the British government dating back to 2002, when the UK Competition Commission concluded that lowering the barriers to entry for competitors providing financial services to small and medium-sized enterprises (SMEs) would improve service and lower the prices paid by SMEs.  The 2002 report spurred additional studies by various UK regulators on how industry consolidation in banking affects outcomes for retail and SME customers. Fast-forward to February 2016, when HM Treasury published a report of the Open Banking Working Group (OBWG) that essentially mandated many of the recommendations made by the 2014 Fingleton Report, which laid out use cases and potential benefits of open APIs to drive innovation in banking and expand competition.  A blog entry by my colleague Patty Hines provides an excellent summary of this report. So while both the US and the UK governments promote innovation and growth in fintech, they come at it from slightly different angles, as is seen in the August 2016 follow-on report of the UK’s Competition and Markets Authority (the successor regulator to the Competition Commission).
“[O]lder and larger banks do not have to compete hard enough for customers’ business, and smaller and newer banks find it difficult to grow. This means that many people are paying more than they should and are not benefiting from new services.”
Even as this statement hints at subtle differences in policy goals, thankfully there’s no need to take one side or the other, as ultimately innovation in financial services can achieve both goals.  Whether creating customer advantage is a stated goal or merely a collateral benefit of fintech, the movement towards opening up the banking system through more accessible APIs will ultimately benefit not only the consumer, but the financial institutions themselves. Clearly, banks need to continually work on sharpening their game in the use of emerging technologies in order to maintain their competitiveness, but for the moment the dance floor remains open for those who choose to embrace innovation rather than fear the change that is to come.

ORACLE: Swinging the Bat in Cloud Services


It's hard to believe that an entire month has gone by since Oracle OpenWorld in San Francisco — but baseball fans will have noticed that things have been a bit hectic here in Chicago of late.  Coincidentally, the Chicago Cubs clinched a playoff berth on September 18th, the very day that Larry Ellison officially opened OpenWorld with the first of his several Keynote presentations.



My primary motivation for attending OpenWorld was to get an update on Oracle's two banking platforms — the new flagship Oracle Banking Platform (OBP) aimed at large retail banks and its stable mate FlexCube, the universal banking platform deployed by nearly 600 banks globally.  After three days at OpenWorld, I realized that the real story of interest to banks is Oracle's emerging cloud story, which coupled with its existing core banking applications business puts them in a really interesting position to transform the core banking systems market.

Now a robust 72-year-old, Larry joked with the audience about being prohibited from climbing the stairs to the Keynote stage, but he still exhibits the intense competitive burn that served his company well during the ERP battles of the 1980s and 1990s.  The difference is that these days, he's less focused on IBM or SAP than on two new challengers:  Amazon Web Services (AWS) on the IT infrastructure side and Workday on the applications side.

Of course, what AWS and Workday have in common is that they are both businesses whose long-term prospects are predicated on the continued growth and development of cloud services.  Larry's first Keynote noted that we are witnessing generational change as companies move from "lots of individual data centers" to a smaller number of "super-data centers called Clouds".  In a separate presentation, Oracle CEO Mark Hurd shared Oracle's view that within the next ten years, 80% of corporate-owned data centers will have been closed, with the remaining data centers running at 20% of today's capacity, handling legacy workloads that are not easily ported to a cloud services environment (hey COBOL — I think they're talking about you!).


According to Larry, Oracle's "overnight success" in cloud services began ten years ago when it started reengineering its original ERP products — licensed software designed primarily for the on-premise market — into a new multi-client, multi-tenant architecture, as befitting a company that was pivoting to the emerging SaaS model.  At OpenWorld, Larry shared how Oracle was extending its original SaaS business to embrace both the Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) models — putting AWS, Google, and Microsoft all squarely in Oracle's competitive sights.

Of the three, AWS appears to be the primary target of Oracle's competitive ambitions:  Oracle's new "Gen2" IaaS platform offers a virtual machine (VM) that offers twice as many cores, twice as much memory, four times as much storage, and more than ten times the I/O capacity of a comparable AWS VM.  But there is a catch, according to Larry:  "you have to be willing to pay less" than what AWS charges for a comparable VM.  (While AWS might take issue with Larry's claims about performance and value, what is clear is that Oracle is planning a serious competitive challenge to AWS's supremacy in the IaaS race.)

In Larry's words, "Amazon's lead is over.  Amazon's gonna' have some serious competition going forward."


This represents great news for the growing number of banks that have gotten past the question "Why cloud?" and have moved on to the more interesting question "How cloud?".  For the largest banks like Capital One that have significant IT development capabilities in-house and the willingness to experiment with new technologies, AWS makes a lot of sense.  Banks can roll their own code, spin up a VM, and off they go.  Capital One's cloud journey has been so compelling, in fact, that the bank is in the process of closing down 5 of its 8 data centers while swinging many workloads to AWS.  Most banks, however, are not Capital One.

For the mere mortals among us — banks coping with practical limitations on their ability to develop and host banking apps — having an IT partner with demonstrable experience on both the application side and the infrastructure side can represent a real game changer in terms of the bank's appetite to make a leap into cloud-based banking services.  While some banks have in the past been a bit overwhelmed by Oracle's ambitious sales pitch featuring its all singing, all dancing suite of integrated applications ("that's very impressive, but I only want a G/L!"), with Oracle's new IaaS offering a bank could mix and match Oracle applications (offered via SaaS) with third-party and in-house developed systems.  That's potentially a game changer for banks interested in cloud services but overwhelmed by the complexity of going it alone with a public cloud provider.

That brings us back to OBP and FlexCube.  OBP is a recently built Java-based core banking solution designed for the needs of large-scale retail banks.  As such, OBP is aimed squarely at mainframe-based core platforms like Hogan, Celeriti, and Systematics.  FlexCube is a universal banking platform that has also seen renewed investment from Oracle in the last few years.  While today its primary market appears to be international banks, as a modular solution FlexCube can address specialized needs in the US market like cash management and trade finance.  OBP and FlexCube continue to compete in the global core banking systems market on their own terms — with OBP having recent success as the foundation for a new digital banking platform at KeyBank (Cleveland) and as the foundation of a complete core replacement project at National Australia Bank (Melbourne).

For larger banks intrigued by the promise of cloud services, but daunted by the complexity of building and operating their own environment, the opportunity to pull down an OBP license that is hosted in the Oracle Cloud while dragging other applications from their private data center to Oracle's IaaS platform could help achieve in one move the twin goals of core banking system and data center transformation.  That's a rare 2-for-1 in a world where a widely-held truism is that every IT decision involves trade-offs among alternatives.

According to Larry, Oracle's annual revenue run rate for cloud is currently about $4 billion.  Amazon recently announced that its revenue run rate for cloud services was north of $10 billion while just yesterday Microsoft announced its own revenue run rate for cloud has approached $13 billion.  These are undoubtedly different businesses (AWS is more or less pure-play IaaS while Microsoft skews towards SaaS by virtue of the strength of its Office 365 business), so I won't pretend to make any apples-to-apples comparisons.  The point remains that while it's still early in the cloud ballgame for Oracle — with deep financial resources, an impressive portfolio of banking applications, and Larry's intense "Will To Win" against the current market incumbents — bank CIOs need to pay close attention to what is going on in Redwood Shores.

A Tale of Two Cities (Conference Season)


Spring is usually the season for conference travel, and this season has been no exception.  While my colleagues were spread out across the US covering conferences in Boston, New Orleans, and other locations, I spent four days in beautiful Barcelona covering the Temenos TCF 2016 Conference, followed by another three days in sunny Orlando covering the FIS Connect 2016 Conference.  The fine warm weather was not the only thing that these two conferences had in common, as Temenos and FIS each took the opportunity to showcase their recent investments in new enabling technologies.

Barcelona

In Barcelona, CEO David Arnott kicked off TCF 2016 — themed Solutions For a Connected World — by examining the company's strategic expansion from what has traditionally been a narrow focus on banking software.  Temenos's transition to a broader focus on financial services was signaled by its acquisition in March 2015 of Multifonds, a provider of software and services to the third-party fund administration business whose solutions support many of the large players in that market, including large global FIs like JP Morgan, Citigroup, BNP Paribas, and Credit Suisse.


Temenos's strategic expansion makes sense in that its flagship core banking platform T24 has traditionally been aimed at the universal banking market, so fund administration represents somewhat of a contiguous market for them.  Speaking of T24, I was a bit surprised at how far T24's re-architecture has come, with John Schlesinger (Chief Enterprise Architect) sharing that an architectural overhaul of T24 will be complete by Release 17 (the solution is currently on R15).

By the time R17 is complete next year, T24 will have been refactored to create a series of frameworks that will facilitate customization of the platform by client banks, allow easier integration with third-party services, and make the transactional data held by T24 more accessible for data analytics.  While most of the work here was under the hood and did not impact T24's existing features and functionality, the overhaul was much needed to expand the solution's reach into larger banks, where architectural flexibility is very important.

The T24 overhaul appears to have already borne fruit, with the Swedish retail bank Nordea signing up last September to implement T24 across its regional base of 10.8 million retail and 500,000 corporate customers in several Nordic countries.  Clearly one of the benefits of the architectural upgrade is improved run-time performance, and John shared that T24 has been benchmarked to support 40 million accounts within an end-of-month processing window of only 2 hours and 56 minutes — an impressive result given that T24 operates on IBM's pSeries server running Windows or Linux, not on the mainframe systems that continue to dominate within large banks.
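For a sense of scale, that benchmark works out to the following sustained rate (simple arithmetic on the figures quoted above):

```python
accounts = 40 * 10**6                # accounts processed in the benchmark run
window_seconds = (2 * 60 + 56) * 60  # the 2 hour 56 minute window, in seconds
rate = accounts / window_seconds     # sustained accounts per second
print(round(rate))                   # prints 3788
```

In other words, roughly 3,800 accounts posted every second, sustained across the entire window.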

Orlando

In Orlando, the theme of FIS Connect 2016 was Empowering the [Financial] World, and it was clear that FIS hopes to leverage some of the new products and technologies acquired with SunGard last November to enhance the traditional banking and payments offerings aimed at the larger banks in attendance.  (FIS's community bank clients had their own conference, InfoShare, back in April.)


While traditional cash management products like CashExpress (data exchange in support of cash concentration activities) had their familiar place on the Connect 2016 exhibit hall floor, new services like Ambit Treasury Management for bank treasury departments and FIS's new SWIFT Service Bureau offering rounded out the already comprehensive set of solutions FIS has assembled for the corporate banking needs of its clients.

Fiserv drove consolidation in the bank technology outsourcing market in the 1980s and 1990s before handing the baton to FIS in the new millennium, starting with the acquisition of Alltel Information Services in 2003 and continuing with the addition of Metavante Corporation in 2009 and SunGard in 2015.  With each acquisition, FIS has generated financial and operational synergies aimed at creating shareholder value, so it was interesting to see how FIS's now multi-year enterprise technology initiative is now beginning to create technological synergies for the benefit of its bank clients.

The exhibit hall showcased some of the first fruits of FIS's program to create new technologies that can support its bank clients across the range of core banking platforms, from the community banking oriented platforms like Horizon and BancPac to larger bank systems like IBS and Systematics.  FIS's Enterprise Customer Experience suite and the new version of its TouchPoint sales and service platform (supporting the branch and call center) were both built to be core platform agnostic and both represent a functional upgrade from the "native" CIF/CIM and sales/service modules associated with FIS's individual core platforms.

By creating new functional upgrades on an enterprise basis rather than through individual core platform enhancements, FIS is hoping to get more bang for its R&D buck, push out new product enhancements to its bank clients more quickly,  and redeploy precious IT funding into long-term banking service innovation.  Even ten years ago, FIS's strategy would have been inhibited by the relatively inflexible programming tools available to bank IT developers.  Today, with the advent of micro-services and standardized banking APIs, FIS has learned that product and services differentiation can finally be made compatible with a single code base.
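To make the core-agnostic idea concrete, here is a minimal sketch of the pattern: one thin adapter per core platform behind a shared interface, with the service logic written once above it.  The class and method names below are illustrative assumptions, not FIS's actual API.

```python
from abc import ABC, abstractmethod

class CoreAdapter(ABC):
    """One thin adapter per core platform; shared services sit above it."""
    @abstractmethod
    def get_balance(self, account_id: str) -> float: ...

class HorizonAdapter(CoreAdapter):
    """Hypothetical adapter; a real one would call into the Horizon core."""
    def get_balance(self, account_id: str) -> float:
        return 125.00  # canned response for illustration only

def render_balance(core: CoreAdapter, account_id: str) -> str:
    # Written once against the interface, reused across every core platform.
    return "Balance: ${:.2f}".format(core.get_balance(account_id))
```

Swapping in an adapter for a different core (say, a hypothetical `IBSAdapter`) requires no change to the shared service code, which is precisely the R&D leverage described above.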

That looks to be a win-win for FIS and its clients, but as with all things in life execution is key as the enterprise technology initiative grows in scope to cover other important parts of the banking IT system.  Stay tuned!

 

The banking railroad of innovation: Follow the river


I'm a big fan of the old movie classics. The TCM channel was a loyal companion during my graduate school days at the University of Illinois, offering a comforting black and white backdrop to frequent all-day programming sessions, and today I frequently call on TCM to get me through my daily hour-long treadmill sessions.

This weekend TCM offered up Jimmy Stewart as railroad detective Grant McLaine in 1957's Night Passage. In this classic Western, McLaine is fired in disgrace over a railroad robbery carried out by his estranged brother, only to be offered a second chance to prove his loyalty to the railroad by serving as the courier for a large cash payroll being sent to the workers at the rail head.


Grant's companion during the critical train ride to the rail head was young Joey.  Riding with Grant on a flatbed car as the train twisted and turned through the Rocky Mountains, Joey asked Grant how the railroad builders knew the best route through the harsh terrain.  This question gives Jimmy Stewart the rare opportunity to showcase his singing and accordion-playing skills as he responds by singing a song called "Follow The River".  The song ends with the chorus:

"Follow the river,
Wherever you may be,
Follow the river back to me."

Just as the railroad builders used the river to guide the design and layout of the early railroads, bankers have used technology to guide how banking services are designed and built.  In an interesting bit of historical irony, the first use of machine-based bank processing was being rolled out by the Bank of America just as Night Passage was hitting the movie theaters.

The system was called ERMA (Electronic Recording Method of Accounting), a machine-driven approach to electronically reading checks and processing the bank's accounts.  ERMA was co-developed by Bank of America and the Stanford Research Institute, launched in 1958, and was able to process 50,000 accounts per day.  While ERMA's initial capacity was small by today's standards, in those days it represented an outlandish number compared with the 10,000 accounts per month that BOA estimated it could process using existing paper-based manual methods.

ERMA ushered in the era of Big Iron in banking (a term also used to describe railroad locomotives), as improvements in the speed and capacity of what we today call the mainframe computer facilitated the rapid growth of the large banks during the 1960s and 70s.  Mainframe computers running programs powered by Rear Admiral Grace Hopper's newly developed Common Business Oriented Language (COBOL) became the river that banks followed when planning and building new banking systems like electronic funds transfer (EFT), automated teller machines (ATMs), and others to meet emerging customer demands.

Mainframe computers are interesting from an operational processing perspective in that data (specifically customer accounts and daily transaction data) takes a while to load, but once loaded, accounts can be processed at a lightning-fast rate.  While ERMA could process only 50,000 accounts in a day, modern mainframes can process millions of accounts in a matter of a few hours.  COBOL itself was scorned nearly from Day One by the computer science cognoscenti as a crude and unstructured way to build an enterprise system.

In 1975, a respected Dutch computer scientist named Edsger Dijkstra made the famous comment that: "With respect to COBOL you can really do only one of two things: fight the disease or pretend that it does not exist," before concluding, "the use of COBOL cripples the mind; its teaching should therefore be regarded as a criminal offense."  Despite the withering criticism from academia, mainframe vendors and banks moved forward on the basis that the systems simply worked. Throughput is the key to understanding how both high-volume banking systems and today's railroad system work.

A case in point is the Canadian National railroad's purchase in 2007 of the Elgin, Joliet & Eastern Line (EJE) to facilitate its rail connection of points east and west through Chicago.  While the distance from Gary, Indiana to Waukegan, Illinois is only 70 miles by car, CN now connects these points using EJE's 198 miles of track.  This makes no apparent sense until you consider that CN is now able to route cross-country trains around the busy hub of Chicago, where previously CN endured a variety of operational restrictions and traffic jams arising from the many at-grade crossings through the congested urban core.  To CN, routing traffic around Chicago rather than through Chicago resulted in more throughput and fewer train delays, more than compensating for the additional mileage.

And so it has gone for the banking processing. The use of oft-criticized COBOL and the unique operating characteristics of mainframe computers was tolerated as there were no other alternatives for banks requiring reliable processing at very high scale. That is, until recently.

Just as the river in Night Passage twisted and turned through the Rockies, the path of technological progress has twisted in a way many bankers did not expect, as cloud services are now challenging the hegemony of mainframe-based banking systems. While a top of the line mainframe computer can be purchased with more than 100 lightning-fast processors, a bank can "rent" thousands, even tens of thousands, of processors for 10 minutes, 10 days, or 10 years. Using software that is tuned to manage the distributed processing of bank accounts across thousands of virtual machines, banks can now meet and exceed the enormous throughput of their mainframe computers at a fraction of the cost.
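A minimal sketch of that fan-out pattern, using a local worker pool to stand in for a fleet of virtual machines; the interest-posting step and the rate used here are hypothetical, purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def post_interest(account):
    # Hypothetical end-of-day step: accrue one day of simple interest.
    daily_rate = 0.0001
    return account["id"], round(account["balance"] * (1 + daily_rate), 2)

def run_batch(accounts, workers=8):
    # Rather than one serial pass over the ledger (the classic batch model),
    # partition the account base across a pool of workers, the same fan-out
    # a cloud platform performs across thousands of virtual machines.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(post_interest, accounts))
```

Because each account is processed independently, the same logic scales from eight local workers to thousands of rented processors simply by widening the pool.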

The king of mainframe computing, IBM, clearly understands and has responded to the changing role of the mainframe in banking.  During the 50th Anniversary celebration of the mainframe in 2014, IBM rolled out its new vision of the mainframe as an uber-sized cloud server, allowing for the hosting of several thousand virtual machines at one time.  Last summer, IBM upped the ante with the announcement of IBM LinuxONE Emperor, a z13-based server allowing for up to 8,000 virtual machines to be hosted on a single machine.

While banks have experimented with cloud services to varying degrees, most of the innovation has taken place at the channel services level, with new online and (particularly) mobile banking applications getting a technology refresh through the unique benefits of cloud services.  While each bank will need to build its own business case for the gradual porting of COBOL-based account processing systems to modern programming languages that are "cloud-ready", it is clear that cloud-based account processing will allow the level of agility in product development that is increasingly called for as channel and payment systems continue to evolve.

Cloud-backed innovation in back office systems has been slow to develop, with many banks citing security and the fear of regulatory issues as inhibitors to adoption.  As the recent two-part Celent report Banking in the Cloud: Between Rogues and Regulators establishes, regulators in fact do not have any objections to banks hosting their banking services in the cloud, provided that banks follow the same standard of care (including encryption, access controls, data masking, etc.) that they maintain in their own data centers.

In time, I expect that the banking railroad will continue to follow the river of innovation that is now leading us directly into the age of cloud services. The proven yet inflexible COBOL-based systems that have served the industry reliably for 50 years will be replaced with agile, cloud-ready account processing platforms that will over time both reduce costs and drive the service quality improvements that banks will need to compete and survive in the increasingly competitive world of financial services.

The iPhone, the FBI, and the lessons for bankers


With today’s news comes the interesting development that the FBI has apparently used a “tool” acquired from an unnamed third-party white hat security firm to gain access to the locked iPhone of one of the San Bernardino shooters without requiring Apple’s cooperation.  This issue had been the subject of a recent tug-of-war between Tim Cook and the US Department of Justice.

While FBI Director James Comey has been mum on the details, some in the IT security community have speculated that the new tool employs a so-called “brute force attack” on the iPhone by sequentially guessing the device’s passcode until the device unlocks itself.  While the lock-out feature is user-configurable, an iPhone running the current version of iOS will normally give the user 10 chances to input  the passcode correctly before permanently locking the user out while deleting all user data from the device.

Cloud services to the rescue.  The speculation is that the newly acquired FBI tool was able to get around this measure by simply cloning the software from the perpetrator’s iPhone — including the operating system and all of the user data files — hundreds or thousands of times and performing what is effectively a “distributed brute force attack” by repeatedly guessing passcodes from a master checklist across the clones in parallel.  When an individual clone becomes locked, it is discarded, and the tool continues the guessing game with the other clones, working through a reduced list of candidate passcodes until one of the guesses finally works.
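The speculated clone-burning logic can be sketched as follows.  This is purely illustrative of the mechanism described above, not the actual tool; `try_guess` stands in for one unlock attempt against a clone, and a real attack would run many clones in parallel rather than one after another:

```python
from itertools import islice

ATTEMPTS_PER_CLONE = 10  # iOS permanently locks a device after ten bad guesses

def crack(candidates, try_guess):
    # Walk the candidate passcode list, burning a fresh clone every ten guesses.
    remaining = iter(candidates)
    clones_used = 0
    while True:
        batch = list(islice(remaining, ATTEMPTS_PER_CLONE))
        if not batch:
            return None, clones_used       # candidate list exhausted
        clones_used += 1                   # spin up a fresh clone
        for guess in batch:
            if try_guess(guess):
                return guess, clones_used  # unlocked
        # this clone is now permanently locked; discard it and move on
```

Each discarded clone eliminates ten candidates for good, so the search always terminates, and with enough clones running side by side the whole passcode space can be covered quickly.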

The likely reason why the FBI has apparently succeeded is the fact that the perpetrator’s passcode was static, meaning it didn’t change during the course of the many times that the FBI tried one guess after another.  (In this context, it was important that the perpetrator was caught, as otherwise  he would have changed his passcode and/or wiped the data remotely, a capability that Apple provides to all iPhone users.)

What does this have to do with banking security?  As demonstrated by the success of the FBI’s  new white hat tool in breaking Apple’s device security, the simple reality of data protection is that no encryption technique is foolproof, particularly from the threat of a brute force attack.

Given the power of the cloud to solve a large computational problem, like guessing a large encryption key via a cloud-based “divide and conquer” approach, banks need to employ strong encryption keys and rotate those keys on a regular basis.

The definition of “regular basis” will depend on the sensitivity of the data to be protected, but one thing is for sure: the bank that creates an enterprise encryption key once and assumes it is protected forever is dangerously vulnerable to a future cyber attack based on a distributed brute force technique such as the one quite possibly used by the FBI’s white-hat vendor.
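To see why both key strength and rotation matter, consider a back-of-the-envelope calculation. The worker count and guess rate below are purely hypothetical figures, chosen only to show how quickly the keyspace dwarfs even a very well-resourced distributed attacker once keys are long enough.

```python
# Hypothetical attacker: one million cloud workers, each testing
# one billion keys per second (deliberately generous assumptions).
WORKERS = 1_000_000
KEYS_PER_SECOND = 1_000_000_000

SECONDS_PER_YEAR = 365 * 24 * 3600

def years_to_find_key(key_bits):
    """Expected years to find a random key (half the keyspace on average)."""
    keyspace = 2 ** key_bits
    seconds = (keyspace / 2) / (WORKERS * KEYS_PER_SECOND)
    return seconds / SECONDS_PER_YEAR

for bits in (40, 56, 128, 256):
    print(f"{bits}-bit key: ~{years_to_find_key(bits):.2e} years")
```

Under these assumptions a 40-bit key falls in a fraction of a second, while a 128-bit key would take on the order of 10^15 years. The residual risk is weakly generated or quietly compromised keys, which is exactly the exposure that regular rotation bounds.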

Given the importance of encryption to maintaining a safe and FFIEC-compliant environment for the safekeeping of NPI, and especially in light of the emergence of services like blockchain that depend on encryption for success, banks ought to be paying close attention.

Yahoo! is for sale: why banks should care

The rollercoaster that is Yahoo! continues. Yesterday, the company officially announced that it was putting itself on the selling block, in a move aimed at holding off an aggressive activist hedge fund called Starboard Value. It was only in December when management shared the stunning news that Yahoo! was planning to spin itself off (more precisely, its core Internet businesses) to its shareholders. The announcement in December came on the heels of a nearly 12-month project aimed at spinning its 15% interest (worth $30 billion) in Alibaba, the Chinese e-commerce company, to its shareholders, a transaction that was abandoned over tax concerns. By spinning out the Alibaba stake to Yahoo!’s shareholders, Marissa Mayer and the Yahoo! board hoped that shareholders could benefit from the Alibaba investment while Yahoo!’s management could focus on rebuilding the company’s core Internet businesses.

Rebuilding is the correct word here. Founded in 1994 and going public in 1996, Yahoo! once lived a charmed life as the so-called “originator” of the search engine. In fact, Yahoo!’s original business represented a searchable directory of websites curated by Yahoo! staff. It was Google that improved on Yahoo!’s original idea by deploying technology that could automate the building of a website directory: using bots to crawl the web, catalog the content of websites, maintain a searchable index of the result, and, most notably, calculate the importance of a website based on the number of inbound links from other websites. Ironically, Yahoo! responded to Google’s innovation quite awkwardly, first partnering with Google, then walking away from the partnership in 2004 as it sought to exploit the technology of acquired businesses such as Inktomi (2002) and AltaVista (2003). After a dalliance with Microsoft’s Bing in 2010, Yahoo! finally came back to Google earlier this year, signing a three-year partnership in October.

What can banks learn from Yahoo!’s adventures?
It’s very simple:  innovation is a game that is played for a full 9 innings. Yahoo! was a public company for two years before Google was even founded, and the company at one point enjoyed a market capitalization of more than $100 billion. Today, Google’s market cap is more than $480 billion while the market cap of Yahoo! is less than $30 billion, which is slightly more than the current value of its holdings in Alibaba. So with full benefit of hindsight, Yahoo!’s original idea to offer a curated list of interesting websites was itself innovative, but it was Google’s use of automation in capturing and cataloging the rapidly growing content of the web that fueled the revolution that drives much of the global economy today.  As the legendary British venture capitalist Sir Ronald Cohen once argued in his book The Second Bounce of the Ball:
“We can all see where the ball is bouncing today, but few of us try to anticipate where tomorrow’s bounce will be, and even fewer will attempt to take advantage of it.”
The forward-thinking banks that heed the lesson of Yahoo!’s current troubles will stop worrying about the pressure coming from the current crop of Fintech upstarts and will focus on that second bounce of the ball, the place where the real opportunity lies.  There is still plenty of time left in this game.

Dispatch from Vegas: Capital One places big bet on AWS

If it’s October, it must be conference season.  The month started innocently enough: a visit to Nashville for the Computer Services, Inc. (CSI) annual client conference.  By virtue of endless industry consolidation over the past 30 years, CSI has been initiated as a full member of the Big-5 fraternity of core banking systems providers, and Celent will be adding its flagship NuPoint banking solution to our pending updated coverage of core banking systems solutions.  I was pleased to be invited to speak to CSI’s clients about innovation in banking (more about that in a future blog post).  Entertainment was provided by the up-and-coming country group Old Dominion; I’d never heard of them, so I was surprised at their excellent performance of songs that they had written for established acts like Blake Shelton (“Sangria”) and Tyler Farr (“Guy Walks Into A Bar”).

After Nashville, it’s been back-to-back trips to Las Vegas for the Amazon Web Services re:Invent developers conference followed by the Bank Administration Institute’s Retail Delivery Conference (BAI-RDS).  For a long time, BAI-RDS has been the premiere conference for retail banking.  When I was busy digging up acquisition opportunities for Metavante in the 2000s, BAI-RDS was a “can’t miss” opportunity to take the temperature of fintech, to see what competitors were up to, and especially to keep tabs on the many startups that had emerged from the shadows to lead the way in internet-enabled banking services.

Those were very heady days for BAI-RDS.  I have vivid memories of packing into the House of Blues in New Orleans as Chip Mahan, founder of online banking pioneer S1, invited a few hundred of his industry friends to a private performance by BB King.  It was November 29th, 2000, a Wednesday evening and yet the party went on well after BB finished up his performance at 11 PM.

Back to Vegas.  Since I also cover cloud services for Celent, I decided to check in on what AWS was up to these days.  Their annual developer’s conference is called re:Invent, and since AWS has only been doing this for the past four years, I didn’t quite know what to expect.  BAI-RDS regularly draws 3,000 attendees, and while I knew re:Invent 2012 drew about twice that number, I was still not prepared for the crowd of nearly 20,000 developers and AWS partners that converged on the Venetian Hotel and Sands Expo.  The many education sessions were scattered over the five floors of ballrooms in the Venetian, while the Expo Hall and keynote presentations were held at the adjacent Sands Expo.

While I didn’t see many bankers wandering the halls of AWS re:Invent, the one banker I did see grabbed my attention: Rob Alexander, CIO of Capital One, who shared the stage with AWS SVP Andy Jassy during the Day One keynote address.  Rob was there to announce that Capital One is deploying its new flagship mobile banking app on the AWS Cloud — I found that nothing less than startling, in that Capital One only started experimenting with AWS last year, running a few mobile app development projects and bank-sponsored hack-a-thons in the AWS Cloud. Based on its initial success, Capital One began migrating development and testing work to AWS at the beginning of the year, and nine months later the bank was sufficiently happy with its experience that it made the bold decision to shift part of its production environment to AWS, beginning with its new mobile banking app. The new app essentially melds Capital One’s existing online and mobile banking applications, with a uniform look and feel, and changes to user preferences made on an iPhone or iPad automatically flow to the user’s online banking experience.
Capital One’s API gateway and 80 individual banking services are in the process of moving to the AWS Cloud as part of the mobile banking services launch, initially on the iPhone and later this fall expanding to the iPad and Android platforms.

What’s the hurry?  Surely Capital One is no start-up: with more than 70 million cards and $80 billion in card balances, Capital One is a top-four credit card issuer.  When combined with its direct banking operations, Capital One is in fact the sixth largest bank in the United States, with $350 billion in assets.  Even as a proponent of the long-term impact that cloud services will have on the banking business, I was nothing less than astonished that Capital One has progressed from cloud newbie in 2014 to going “all in” on AWS in 2015.

It didn’t take long to see what Capital One was up to.  By leveraging AWS for DevOps and (over time) production, Capital One is on track to reduce the number of data centers it owns and operates from 8 in 2014 to 5 by 2016, and to only 3 by 2018.  Capital One intends to redeploy the capital it will recoup from data center consolidation into its core businesses while increasing the scope and pace of innovation at the bank. Capital One is betting that an AWS-based mobile banking platform will allow the bank to support the level of real-time scalability needed to cope with demand spikes such as occur on Black Friday and Cyber Monday.

But what about security?  Security is the most commonly cited reason why most banks are not embracing cloud services, so I was interested to hear Capital One’s take.  According to Rob, “Of course, security is critical for us.  The financial services industry attracts some of the worst cybercriminals.  So, we work closely with the Amazon team to develop a security model which we believe enables us to operate more securely in the public cloud than we can even in our own data centers.” More securely in the cloud?
Either Capital One has gone rogue (very doubtful) or it knows something that most banks have yet to reconcile: when it comes to security, it’s much less about where your sensitive data sits and much more about how you secure your data from pranksters and thieves.

AWS’s re:Play evening entertainment was provided by Zedd, a Russian-German musician and DJ whom I also had never heard of (although my 18-year-old college student had).  Chip Mahan was nowhere in sight, but I might have missed him in the crowd of 19,000 AWS converts in the audience.  Zedd was no BB King for sure, and unlike Old Dominion I didn’t know any of his songs, but nevertheless his techno-pop performance was hypnotically entertaining.

I could have gone back to Las Vegas for a third consecutive week, where the ever-growing Money20/20 conference beckons, but alas I’ll be in Denver for the Association for Financial Professionals annual conference.  Dan Latimore and Zil Bareisis will be at Money20/20, and I’m eager to hear about their experience.

The new face of digital banking

I’m just back from a very interesting week in London at the Marketforce-sponsored “The Future of Digital Banking” forum.  I served as Chairman for Day 1 of the two-day conference and had a front-row view of the proceedings. The theme of the conference was that customer-centricity and innovation need to be the guideposts for a bank’s transformation from a bricks-and-mortar operation into a digital enterprise.

Perhaps my favorite presentation of Day 1 was offered by Dr. Nicola Millard, the Head of Customer Insight & Futures at British Telecom (You mean the phone company? Yes, that BT!). Nicola has a PhD in Human-Computer Interaction, a very hot area of IT these days given the focus on UI organization and its impact on user experience. According to Nicola, when dealing with clients’ increasing demands for digital service provisioning, banks should assume that clients are self-centered (it’s all about me!), believe that all banking services should be extremely easy to use, and yet require quick access to a live person when all else fails (live chat is preferred). One stat that struck me: 90% of BT’s clients would like the ability to email the customer service agent they had spoken with on the phone. I’m not sure how this would work out in banking, where phishing is always a concern, but it’s still great food for thought.

On the heels of the conference came news that Atom Bank, a new challenger bank in the UK, had received its full license from the Bank of England.  Sophie Haagensen, Head of Strategy and Planning for the Durham-based bank, had participated in a panel on Tuesday and had indicated that Atom Bank would seek to leverage its position as a legacy-free direct bank (no physical branches) to build a new model for customer engagement. As if to drive home the point, Atom Bank announced that it will initially be operating in “mobile only” mode, with online banking to be rolled out at a later date.
It was also announced that FIS was selected as Atom Bank’s IT outsourcing partner. The specific core banking platform selected was not disclosed, but it’s likely to be Profile, the real-time system that FIS leads with in the international market. This is certainly good news for FIS, which had been looking to put some points on the board since global rival Fiserv announced in July of 2014 that it was launching a new outsourcing service called Agiliti (based on its well established Signature core banking platform), with Think Money Ltd. as its launch client. With more than two dozen firms having applied for banking licenses in the UK, it appears that the competitive heat of summer has finally arrived in London.

London calling: “payments are missing”

From a brief scan of the morning headlines comes news of an IT glitch at the Royal Bank of Scotland. RBS Help announced via its Twitter account at 5:45 AM London time yesterday that “some customer payments are missing this morning — we are investigating this issue as a matter of urgency,” and further advised that affected clients could call or visit their branch if they “have been affected and need to access funds today.” Six hours later, RBS reported that the underlying issue had been fixed and that all missing payments – reportedly 600,000 payments directed to client accounts at RBS, NatWest, Ulster Bank, and Coutts – would be restored no later than Saturday. While certain questions remain (like what sort of a payments glitch would take three full days to resolve?), to RBS’s credit, it attempted to get ahead of the issue quickly in order to allow customers to mitigate any problems that might arise from the bank’s glitch.

Somewhat predictably, some industry observers are already pointing to RBS’s mainframe-based “legacy” payment processing systems as the root cause of the problem. Huh?  The logic appears to be that systems that are old can’t operate reliably, simply because they’re old.  One observer went so far as to claim that a lack of courage was the key reason why banks delayed needed modernization projects (OK, courage was not exactly the word used, but you get the point).

To put this in perspective, RBS is said to have roughly 20 million retail clients in the UK.  RBS has yet to make an official statement on the matter, and so we’re left guessing whether the 600,000 payments impacted by the glitch represented a small part, large part, or all of the bank’s payment flow on Tuesday.  (I’m guessing it’s a small part, but that’s no consolation to those who were impacted.)
The last time a similar outage in RBS’s payments system occurred (in June of 2012), the culprit was found to be a problem with a systems upgrade, specifically a piece of software that handled the processing of payment batches. RBS has not yet officially commented on the cause or scope of the glitch this time, so much of what has been written in the last 24 hours is little more than speculation. What is clear is that most mainframe systems do work reliably on a daily basis – RBS is said to process on average more than 20 million payments a day – and systems rarely fail spontaneously of their own accord. So when RBS does comment on the situation, I will be looking to read the tea leaves for signs of the human element at work here – inadequately tested system upgrades or simple human error that was not immediately detected by the bank’s internal controls.

That’s not to say that banks should ignore the opportunity to continually update and improve their legacy processing environment (they ought to!), but it does seem a bit cartoonish to treat every processing glitch as another call to action for wholesale systems modernization. The banks that do embark on systems modernization projects are not Cowardly Lions returning from the Land of Oz.  Rather, they are forward-looking institutions that seek market leadership through the delivery of better products and a more satisfying banking experience for the customer. For these banks, innovation is the goal and IT agility is the route to their destination.

Where front-office innovation leads, back-office legacy will follow, and so it will be in time with the mainframe.  In the meantime, let’s show some respect to our long-serving and trustworthy friends. Let’s stay tuned…