When $250 Million Can’t Buy Cyber-Peace

Last week’s newspapers brought the unsettling news that JP Morgan Chase’s internal CRM systems were penetrated by unknown attackers, compromising the personal information of 76 million households and 7 million small businesses. The bank released a statement to its clients on Thursday noting that “there is no evidence” that account numbers, ATM PINs, or Social Security numbers were accessed during the cyber attack. Today, news reports indicate that four other large financial services companies, including Citibank and E*Trade, were targeted by the same group, thought to be based in Eastern Europe or the Middle East. In the case of JP Morgan Chase, the investigation has focused on the personal computer of a single employee whose system may have been compromised by malware.

The incident continues to be investigated by the FBI, the Secret Service, and JP Morgan’s own private vendors, so there’s no need to speculate on who is responsible or what other information may have been compromised in the attack. Still, I hasten to note that the bank’s soft “no evidence” qualifier gives it plenty of wiggle room should the investigation uncover additional data leakages. The point here is that, like the two other large data breaches of 2014 (Target and Home Depot), the JP Morgan Chase breach occurred in its private data center, the kind that is built at significant cost to resist these sorts of attacks, or at least to detect and repel them when they do occur. JP Morgan’s annual report shares that the bank spends more than $250 million annually on cybersecurity, and it will have 1,000 employees focused on the task by the end of this year. Most banks do not have the size or management scale to match JP Morgan Chase’s annual investment, but if even $250 million can’t buy cyber-peace, what chance do average-sized banks have of protecting themselves from the next malware du jour? I contrast this situation with the growing use of cloud services in the financial services industry.
While other industries have been quick to embrace the cost, capability, and flexibility of cloud services, the banking industry lags behind, largely because of valid concerns about information security and control. JP Morgan Chase’s announcement serves as a wake-up call to banks of every size: where sensitive client data is concerned, private data centers and public cloud providers are partners in the ongoing fight for data security. The next bubble to burst will be the long-held presumption that maintaining customer data in a private data center is inherently safer than storing it in a public cloud. To a cyber-attacker, an IP address is an IP address. Whether sensitive customer data is located on a physical server on the bank’s premises or a virtual server in a public cloud is mostly irrelevant. What really matters is how well a bank (or its service provider) monitors network traffic, detects unusual or malicious activity, and shuts down suspect traffic. The other lesson here is that, as always, a little encryption can go a long way in keeping customer data safe from the prying eyes of clever and determined hackers.
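On that last point, even a minimal sketch shows why encryption at rest matters: an attacker who exfiltrates ciphertext without the key learns nothing useful. This example uses the Python `cryptography` package’s Fernet recipe purely as an illustration; it is not a claim about what JP Morgan or any bank actually deploys.

```python
# A minimal sketch of encrypting a customer record at rest, using the
# third-party "cryptography" package's Fernet recipe (an illustrative
# choice, not a statement about any bank's actual tooling).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, kept in an HSM or key vault,
f = Fernet(key)               # never alongside the data it protects

record = b"acct:12345; ssn:xxx-xx-xxxx"
token = f.encrypt(record)     # this ciphertext is what lands on disk

# Only a holder of the key can recover the original record.
assert f.decrypt(token) == record
```

The design point is key separation: if the database and the key live in different trust domains, a breach of the data store alone (the Target/Home Depot pattern) yields only opaque tokens.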

Spending a day with IBM’s Watson

As an IBM alumnus (but no longer a stockholder) I’ve gotten pretty used to seeing the company do things a certain way. And then I attended a day-long “Watson at Scale (aka Ecosystem 2.0)” event on October 7 and had a lot of my old notions upended. Watson, of course, came to prominence when it won Jeopardy! in 2011. Immediately afterward, IBM began experimenting with a select number of industries (healthcare, travel, and retail) to demonstrate proofs of concept and learn what works and what doesn’t. Beginning in January 2014, Watson expanded dramatically and now covers 26 industries. IBM proclaims that Watson is the harbinger of a new era of computing, what it calls “Cognitive Computing.” There’s just too much information being created today for any single person to digest; Watson aims to “amplify” experts’ capabilities. Doctors, salespeople, and wealth managers are but a few examples. IBM says there are four key attributes to understand:
  • Watson understands natural language (computational linguistics).
  • Watson is a voracious reader.
  • Watson provides recommendations with confidence levels.
  • You don’t program Watson; you teach it.
Mike Rhodin, the IBM SVP who leads Watson (under an unusual board-governed structure), described the key insight about Watson: it’s not that Watson gives answers, but rather that it generates hypotheses, gives confidence intervals around those hypotheses, and provides evidence trails. It does this by ingesting enormous amounts of data, being taught by humans through a series of questions and answers, and then learning on its own as it proceeds. The more data it has, the better it performs. Watson does a better job providing recommendations for people when it knows something about them. A salesperson will sell differently to an introvert than to an extrovert, as a simplified example. Watson can generate a personality profile on the basis of a person’s Twitter feed or blog posts; I’m not sure how accurate it is, but the concept alone is pretty startling.

Terry Jones, an entrepreneur previously associated with Travelocity and Kayak, introduced a new company called WayBlazer that uses Watson’s technology. It aims to answer queries like, “I want to go on a golf trip with my buddies in October,” or, “Give me an itinerary for Costa Rica in May with my two kids.” Watson might ask clarifying questions, and then would come back with recommendations. The prototype is currently in place for Austin, but a service like this highlights clearly what Watson has the potential to do, if implemented successfully. Another lightbulb for me was Terry’s description of Watson as a liberal arts major, not a math geek. It could do airline pricing optimization, but that’s not what you’d buy it for. WayBlazer is but one example of the ecosystem that Watson is building. Realizing there’s a shortage of skills in Cognitive Computing, IBM has teamed with ten universities to offer courses on the subject; this fall all of the classes were oversubscribed. From a standing start in January, IBM has about 100 partners and expects that number to continue to grow.
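The hypotheses-with-confidence-and-evidence shape that Rhodin described can be pictured with a toy sketch. To be clear, this is not IBM’s actual API; every name below is invented for illustration.

```python
# A toy illustration (NOT IBM's API) of the shape of a Watson-style result:
# ranked hypotheses, each carrying a confidence score and an evidence trail.
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    answer: str
    confidence: float                              # 0.0 to 1.0
    evidence: list = field(default_factory=list)   # supporting passages

def top_hypotheses(hypotheses, threshold=0.5):
    """Keep hypotheses above a confidence threshold, best first."""
    return sorted(
        (h for h in hypotheses if h.confidence >= threshold),
        key=lambda h: h.confidence,
        reverse=True,
    )

candidates = [
    Hypothesis("Toronto", 0.14, ["weak passage match"]),
    Hypothesis("Chicago", 0.96, ["O'Hare named for a WWII naval hero"]),
]
best = top_hypotheses(candidates)
# best contains only "Chicago"; the low-confidence guess is filtered out
```

The point of the structure is Rhodin’s insight: the system’s output is not a bare answer but a ranked set of candidates, each of which a human expert can inspect and accept or reject via its evidence trail.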
Watson had one API in January; it now has eight, with more than a dozen in development. IBM may have finally figured out how to execute at startup speed under the umbrella of Big Blue. Watson’s interest in financial services is currently very focused: insurance is one key space, particularly around underwriting; wealth management is another; and risk and compliance is a third. You may disagree with Watson’s prioritization, but the intentional focus is spot-on: the team has to demonstrate some tangible successes before it begins to branch out. Based on various discussions, Watson’s revenue will come from four sources:
  1. Consulting to investigate and establish what Watson will do for the firm
  2. Priced products (e.g., oncology)
  3. SaaS revenues from running Watson for individual projects
  4. A cut of the revenue that partners earn from Watson projects
What’s ultimately different this time? In this new IBM (a place the company has been forced to by intense competition), Watson:
  • Is playing the role of an ecosystem platform
  • Is using partners to reach consumers, realizing that IBM’s strength is as a B2B company
  • Has built a new physical space, reversing a trend of selling real estate and having employees work remotely
  • Is not trying to do this on the cheap
  • Is focused on just a few areas
What does this mean for banks and financial services firms? The IBM take is, of course, that you’ve got to be exploring Watson or you’ll be hopelessly behind. That’s overly broad, but I recommend that firms at least get up to speed on the potential of the technology and see whether it can apply to them. We’ll have to watch to see if Watson carries through on its promise, but efforts like this are a necessary (if not sufficient) first step in the right direction for IBM.  

Google Wallet Relaunches and Takes on PayPal at Its Own Game

They say, “imitation is the sincerest form of flattery.” In my report last year I contrasted Google Wallet and PayPal as representatives of two fundamentally distinct approaches seeking to win the battle to bring mobile payments to the high street. Not anymore – having failed to ignite the market in the first 12 months of its first incarnation, Google Wallet re-launched yesterday with a revised approach, essentially taking a leaf out of PayPal’s book. Unlike PayPal, “Google Wallet 2.0” will continue to focus on NFC technology. However, instead of storing all the payment credentials on the secure element inside the phone, it is moving most of them into the cloud, leaving on the phone only a prepaid account, which is based on MasterCard’s PayPass and can be used anywhere PayPass is accepted. The prepaid account is linked directly to any of the debit or credit payment cards (MasterCard, Visa, Amex, Discover), which customers can register themselves, just as they would register a card as a funding source for a PayPal account. More details on the new Google Wallet here. So, what does this mean, and who are going to be the winners and losers? It’s early days, of course, but here are some of my preliminary thoughts:
  • Consumers, Google and payment networks, especially MasterCard, are likely to emerge as the winners here. Consumers are now in control and can register and manage their cards directly with Google, independent of their banks. They will have to learn to trust Google, which has some work to do to re-establish its image after the initial security concerns. However, as and when consumers come on board, this will be good news for Google and its card partners.
  • While Google continues to stick with NFC for the “last mile” technology, MNOs will continue to have a say in this game. However, this setup now lays the ground for Google to potentially decide to bypass the secure element, and the MNOs, altogether in the future.
  • The impact on banks is likely to be mixed. Most banks didn’t want to play with Google when it was offering the opportunity to digitalise their payment credentials directly and remain in control of the payments portion of the transaction. Now, while bank cards will continue to be part of the transaction, they are clearly taking a back seat, and banks will have to deal with Google as a “merchant of record” for their transactions. True, they won’t have to incur the extra costs of provisioning their card credentials onto the secure element, but that would also rule them out of participating in other NFC ventures, such as Isis.
  • The biggest unknown is the impact on merchants, because the transaction economics are no longer obvious for Google Wallet; this is the question I am most keen to find out more about. In the initial setup, Google was clear that it would not take a cut of the payment transaction, and the merchant would have paid a standard fee depending on the card used. Now, from the merchant’s point of view, they are accepting a prepaid MasterCard, while it might be an Amex card that actually funds the transaction. PayPal deals with this by having direct acquiring relationships with its merchants and offering them a discount rate that represents an expected blend of funding transactions. Does this mean that Google Wallet will have to establish relationships with acquirers to recoup from merchants any potential differences in transaction costs? Or will it have to charge the end user for “loading” their wallet, something that other prepaid card providers do for card-based reload transactions?
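The economics question in that last bullet can be made concrete with some back-of-the-envelope arithmetic. The rates below are hypothetical placeholders, not actual interchange fees, and the function name is invented for illustration.

```python
# Rough, purely illustrative arithmetic for the merchant-economics question:
# the merchant pays the prepaid-card rate, while Google pays the fee on the
# customer's real funding card. The rates here are hypothetical placeholders.
def wallet_operator_margin(ticket, merchant_rate, funding_rate):
    """Per-transaction margin for the wallet operator if it collects the
    prepaid-card fee but pays the funding card's fee behind the scenes."""
    collected = ticket * merchant_rate   # fee on the merchant side (prepaid MC)
    paid_out = ticket * funding_rate     # fee on the funding card (e.g., Amex)
    return collected - paid_out

# A $100 ticket: prepaid MasterCard at a hypothetical 1.5%,
# funded by an Amex at a hypothetical 2.5%.
margin = wallet_operator_margin(100.00, 0.015, 0.025)
# margin is -1.0: the operator eats $1 unless it recoups it elsewhere
```

Whenever the funding card is more expensive than the prepaid rate the merchant pays, someone has to absorb the spread, which is exactly why the acquiring-relationship and wallet-loading-fee questions above matter.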
It’s Day 1 after re-launch and, naturally, there are more questions than answers. Only time will tell how successful Google Wallet 2.0 will be, but for now it feels like a step in the right direction, at least from Google’s perspective.

Bank Mobile Wallets: NFC or Cloud?

Extensive travel tends to wreak havoc on the usual patterns and the best intentions. As a result, I haven’t yet had a chance to blog about an interesting development first announced a couple of weeks ago. FIS, a large technology and services provider, has announced a new m-payments system, developed in partnership with Paydiant, a mobile technology company. Celent clients may recall my recent report, “What’s In Your Mobile Wallet? Winning the Battle for Mobile at the Retail POS,” in which I described the four major domains that represent the key battlegrounds for bringing mobile payments to physical stores in the developed markets. In that report, I suggested that banks are in danger of losing control over POS payments to cloud-based wallet providers, such as PayPal and others. I also said that NFC, despite all the concerns around infrastructure and business models, represents the best chance for banks to keep their payment credentials in use at the POS in the mobile world. With the announcement from FIS, it seems that banks can take on the cloud-based wallet providers at their own game. FIS and Paydiant have developed a cloud-based solution that can be integrated into the bank’s mobile app and simply requires downloadable apps for consumers and retailers. Because the payment credentials reside in the cloud, none need to be exchanged at the POS, giving everyone additional peace of mind and relieving retailers of PCI compliance requirements. In the demo shown to Celent in Boston, the POS terminal produced a QR code, which the consumer scanned with the app on his mobile phone, triggering the payment transaction. The QR code is only one possible communications technology – NFC could be used instead if both the terminal and the phone were NFC-capable. The payment is made via one of the payment instruments (e.g., a card) that the consumer has pre-registered with the app and the retailer already accepts.
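The flow from the demo can be sketched in a few lines. The actual FIS/Paydiant protocol is proprietary; every name below is hypothetical, and a dictionary stands in for the cloud service.

```python
# A highly simplified sketch of the cloud-wallet flow described above.
# All names are hypothetical; a dict stands in for the cloud back end.
import uuid

TRANSACTIONS = {}   # the cloud service's pending-transaction store

def pos_create_qr(merchant_id, amount):
    """POS registers a pending transaction; the returned token is what
    the terminal renders as a QR code."""
    token = str(uuid.uuid4())
    TRANSACTIONS[token] = {"merchant": merchant_id, "amount": amount,
                           "status": "pending"}
    return token

def wallet_scan(token, registered_instrument):
    """Consumer app scans the QR; the cloud charges a pre-registered
    instrument. No payment credentials ever pass through the terminal."""
    txn = TRANSACTIONS[token]
    txn["instrument"] = registered_instrument   # e.g., a card on file
    txn["status"] = "paid"
    return txn

qr = pos_create_qr("store-42", 19.99)
result = wallet_scan(qr, "visa-****1111")
# result["status"] is "paid", and the merchant never saw the card number
```

The security property that relieves retailers of PCI scope is visible in the sketch: the only thing crossing the counter is an opaque transaction token, while the card details stay between the consumer and the cloud service.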
The app could also be developed by retailers rather than banks. In fact, retailers might find the solution easier to implement than banks, as they control the acceptance side. Banks wishing to use this solution must ensure that there are enough merchants that have downloaded the appropriate app and are willing to let customers use it. All of which points to the need to create and manage a new scheme, one that consumers recognise as they decide which app to pull up on their mobile phones at the POS. Still, I think it’s a very interesting solution and one that allows FIs and retailers to explore the opportunities around cloud-based wallets.

Winning the Battle for Mobile at the Retail Point of Sale

Over recent months, there has been a considerable increase in the buzz around mobile and electronic wallets in the developed markets. New wallets have been launched (e.g., Google Wallet, Amex Serve), with many more companies announcing their intent to compete in this space (e.g., Visa, PayPal, Isis, and others). A number of industry leaders have proclaimed (again) the end of physical wallets. Are all these new wallets fundamentally the same? If not, how do they differ? What challenges do they face? What does it take to replace a physical wallet? Who is most likely to emerge as a leader in this space? How will they compete? What does it mean for the payment industry incumbents? These are the questions I explore in my new report “What’s in Your Mobile Wallet? Winning the Battle for Mobile at the Retail POS,” published yesterday. One of the insights of the report is that retail POS is not just about NFC. In fact, despite all the challenges of implementing NFC-based solutions, they might just offer banks an opportunity to remain in control of merchant and consumer relationships. The alternative vision of commerce promoted by cloud-based mobile wallet providers, such as PayPal, is a lot less appealing to banks and other incumbents. The report defines the four major domains along which players will compete to bring mobile solutions to retail. It also describes the requirements mobile wallets should fulfill in order to succeed in the market and how specific features are likely to evolve. Finally, the report offers predictions on how the market is likely to develop and makes recommendations for financial institutions. Let me know if you agree with my conclusions.

Financial Services Next?

Amazon Web Services (AWS) announced AWS GovCloud, a new AWS Region designed to allow U.S. government agencies and contractors to move more sensitive workloads into the cloud by addressing their specific regulatory and compliance requirements. Because AWS GovCloud is physically and logically accessible by U.S. persons only, government agencies can now manage more heavily regulated data in AWS while remaining compliant with strict federal requirements. The new Region offers the same high level of security as other AWS Regions and supports existing AWS security controls and certifications such as FISMA, SAS 70, ISO 27001, FIPS 140-2 compliant endpoints, and PCI DSS Level 1. AWS also provides an environment that enables agencies to comply with HIPAA regulations. This is the kind of announcement I could also see for financial services, where regulators could bless a specific region of a cloud as suitable for banking data. I think this is a huge step toward truly cloud-based financial services. I never thought that service providers would find it economical to make their entire cloud as secure as banks would require; making a region of the cloud that secure is a viable option. What do you think?

Amazon’s fall from the cloud

Amazon’s cloud computing offering went down, bringing lots of other dot-coms down with it. I have never been a huge fan of cloud computing, as I explained in a previous blog post, Cloud or Fog? My concerns have always been around security: if you don’t know on which server or in which data center your data is stored, how can you be certain that it is secure? Availability is the other concern. Even a cloud provider that has SLAs and meets them may fall short: Amazon offers an SLA of 99.9%. Is that good enough for banks? Not being able to access reddit.com or groupme.com is unfortunate, but not tragic. Not being able to access your bank is rather a bigger issue. I doubt most banks would settle for three nines. My take on the cloud for banking has always been that banks are held to a higher standard, both in terms of security and availability; a breach in either will lead to massive reputational risk. That’s why when banks go to “the cloud” they do so using service bureaus or shared data centers that have higher security and availability than required by the typical dot-com. Having said that, when amazon.com itself is down for an hour, the consequences are enormous. Banks have outages as well, so let’s not pretend we are perfect. But we do need to be closer to perfection than the average cloud consumer, and until that reality changes, I’d stick with the service bureau or shared data center for mission-critical applications.
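The “three nines” point is easy to quantify: an availability percentage translates directly into allowed downtime per year.

```python
# What a given availability level permits in outage time per year.
def downtime_hours_per_year(availability):
    """Hours of allowed downtime in a 365-day year at a given availability."""
    return (1.0 - availability) * 365 * 24

for label, a in [("three nines", 0.999),
                 ("four nines", 0.9999),
                 ("five nines", 0.99999)]:
    print(f"{label}: {downtime_hours_per_year(a):.2f} hours/year")
# three nines allows roughly 8.76 hours of outage a year; five nines
# allows only about 5 minutes
```

A 99.9% SLA therefore permits almost nine hours of downtime a year, which helps explain why banks accustomed to service-bureau availability find a three-nines commitment hard to accept for mission-critical systems.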

Cloud or Fog?

Working in San Francisco, I am not unfamiliar with fog.
It can swirl around you and disorient you, obscuring the reality of what is really happening. I think that the hype around cloud computing is more fog than cloud. What is cloud computing? In the Celent report Cloud Computing, SaaS, and Technology Outsourcing for Banks, we state that cloud computing is the use of computing resources, typically a server or part of a server, over the Internet. To amplify, this means that instead of installing a server on site, a company can take advantage of a server in some other location without having to manage (or know how to manage) the physical box. This is typically paid for on a per-usage basis over time rather than with an upfront fee, meaning that a company could use the server for one hour a day and pay only for that time. In the banking industry we’ve had “cloud computing” for thirty or forty years; it is called a service bureau.

I am attending HCL’s analyst conference in Boston and was gratified to hear another voice blowing against the unrelenting storm of clouds. Vineet Nayar, CEO of HCL, stated that he didn’t understand what all the fuss was about; he didn’t see anything new on the technology front around cloud computing. Consider the technologies associated with cloud computing. Server virtualization is a big deal, provided by companies like VMware and IBM. Virtualization enables cloud computing, but it isn’t new to the enterprise: enterprises have been using LPARs (IBM’s virtual machines on the mainframe) since the advent of the System/370. This is important technology and a big deal, but it isn’t cloud. We all know that server virtualization will have long-lasting implications for large IT departments, increasing efficiency and reducing costs. Software as a Service (SaaS), as exemplified by Salesforce.com, is a subset of cloud.
Software as a Service is when a vendor licenses an application to a client on demand, taking care of the management and maintenance of both the hardware and the software. The SaaS provider may be using the cloud to run its software. This is nothing but a service bureau: firms such as Metavante (now FIS) run software and systems and charge banks per account per month. There is nothing new here. In a previous blog post, I stated that I was won over to the cloud after hearing about banks using salesforce.com for account origination and customer information. I am still a believer in this. I do believe that banks will continue to use other companies to access information, software functionality, and computing power; they have been doing so for the past thirty years. The term cloud is bandied about with great frequency, but I’m afraid the reality is that this is more fog than cloud. Use server virtualization. Use the Internet to access services from other companies. Call it private cloud, public cloud, or whatever you’d like. Just remember that when you add a lot of hot air to fog, it rises to become cloud.

Banking in the Cloud

I have undergone a conversion. After talking to a bank CIO and attending the salesforce.com Cloudforce Tour yesterday in San Jose, California, I now see a place for the cloud in banking. The bank CIO talked about moving not just prospecting, but account origination, to the cloud, specifically to force.com. The tools available on force.com make for a reasonable environment and development platform for this bank; it can make sense. Having seen the security hurdles that banks impose, I would never have thought that a horizontal provider could meet banks’ rigorous requirements. Salesforce.com is SAS 70 compliant. A year ago I wouldn’t have thought about moving real banking activity to the cloud. Today I do. Are you moving any of your banking activities other than prospecting to the cloud?

Cloud Computing and Interoperable Components

Dr. Vishal Sikka, the CTO of SAP, talks about timeless software which has four major thrusts, two of which are:
  • Cloud-based consumption of computing
  • Interoperable components: lessons learned from SOA
The challenge of the two isn’t building the cloud and building the services, but integrating the services of the service provider and the service consumer. In other industries, such as oil and gas, SAP has sufficient critical mass to be a de facto standard. In banking this simply isn’t the case: there are few commonly agreed-upon data structures, messaging standards, etc. SWIFT and IFX are the exceptions that prove the rule. Banks don’t have plug-and-play interoperable components, and that makes cloud-based services more of a challenge. I think SAP understands the challenge and therefore spun out its business processes for banking into the non-profit BIAN. From www.bian.org: “The target of BIAN is to enable faster strategic and operational changes of the banking business by providing systematically defined banking functional IT services based on a broad consensus in the banking industry.” Other technology vendors have signed up, such as SunGard, Callataÿ and Wouters, Temenos, and Microsoft. Banks such as Deutsche Postbank, ING, and Credit Suisse have also joined. BIAN has a long road ahead to becoming a de facto standard, but within a few (more than one or two) years it could become one: not universally, but widely, adopted. That would be a huge accomplishment for BIAN. Best of luck.