Joan McGowan

About Joan McGowan

Joan McGowan is a senior analyst with Celent's banking practice. Her research focus is on risk management, regulatory compliance, data security, and data analytics, with a concentration on disruptive technologies for risk transparency and data visualization. She brings a "risk-reward" perspective to strategic decision-making.

Robots Offer Helping Hand to Fraud Investigators

Banks continue to be plagued by extremely high false positives. (They report anywhere between 75% and 90% false positive rates across their fraud and AML transaction monitoring systems.) False positives trigger alerts that then have to be resolved by fraud investigators. It is a long-standing and frustrating problem. Treating false positives, which in essence are low-level alerts, requires daily repetitive work by highly specialized employees. For each alert, the investigator has to toggle across multiple internal and external sites, copying and pasting links and taking screenshots of information as evidence. It is a humdrum and error-prone task. And, more crucially, it takes valuable time away from the investigation of high-risk alerts.

So it’s good to see yesterday’s press release from @NICE_Actimize introducing robotic process automation (RPA) into the world of financial crime investigations. Actimize has integrated RPA into its case management solution. The solution deploys attended and unattended robots onto the investigators' screens to automate routine tasks involved in resolving alerts and cases.

I must note that @Pegasystems offers an RPA solution for repetitive back-office fraud resolution processes, and several other vendors have RPA for financial case management on their roadmaps. However, RPA is far from being a mainstay of a bank's financial crimes technology stack.

RPA is a low-cost technology that has the potential to substantially increase the productivity and accuracy of AML and fraud investigations. Actimize's ambitious goal is to increase investigator productivity by up to 50%. Whatever the percentage, the end result should be that investigators have the wherewithal to be proactive in the deterrence of suspicious activities.

It is not a stretch for banks to deploy RPA together with machine learning and natural language processing to automate not only repetitive tasks but to carry out simple, judgment-based tasks in the resolution of cases or filing of SARs/CTRs.

Actimize’s solution offers both attended and unattended robots. Attended robots are accessed by a tab on the investigator’s screen and, when required by the investigator, can support their daily tasks. Unattended robots are deployed in the background to quietly and quickly complete predefined routine tasks.

Unattended robots can perform checks on the queue of alerts to ascertain which alerts can be closed without exceptions. In most banks this can be up to 50% of the alerts in the queue. An unattended robot can automatically call or message the customer for further information and perform the fulfillment activities that follow an investigation, such as issuing a new card or sending a letter.
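As a rough illustration of the kind of triage an unattended robot performs, the sketch below auto-closes alerts only when every check passes. The field names, rules, and thresholds are hypothetical assumptions for illustration, not Actimize's actual logic.

```python
# Hypothetical sketch of an unattended robot's alert triage pass.
# Field names and closure rules are illustrative assumptions,
# not any vendor's actual product logic.

def can_auto_close(alert: dict) -> bool:
    """An alert closes without exception only if every check passes."""
    return (
        alert["amount"] < 500                   # below review threshold
        and alert["customer_risk"] == "low"     # low-risk customer profile
        and not alert["watchlist_hit"]          # no sanctions/watchlist match
        and alert["prior_sars"] == 0            # no prior SAR filings
    )

def triage(queue: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split the queue into auto-closable alerts and those needing an investigator."""
    closed = [a for a in queue if can_auto_close(a)]
    escalated = [a for a in queue if not can_auto_close(a)]
    return closed, escalated
```

In practice the closure rules would be defined by the bank's compliance team and audited, but the shape of the task is the same: deterministic checks first, human judgment only for what remains.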

Attended robots can be used on demand for such tasks as copying and pasting or navigating between systems and screens to help the investigator complete the evidence gathering processes.

As banks focus on operational efficiencies across risk platforms, it seems to me that Actimize’s goal is doable.

 

Celent Model Bank Awards: Fraud, Risk Management, Process Automation and Flub-Free

It is my privilege to be part of the judging panel for Celent Model Bank Awards for 2017 for the following three categories:

  • Fraud Management and Cybersecurity – for the most creative and effective approach to fraud management or cybersecurity.
  • Risk Management – for the most impressive initiative to improve enterprise risk management.
  • Process Automation – for the most effective deployment of technology to automate business processes or decision-making.

A common theme across this year's submissions for the above categories is the importance of agile technology, digital process automation, and consistent and focused practices across the organization. A large number of the entries show that a streamlined and automated operational risk framework is critical to running a successful risk management program. Everything connects and has a consequence, and unless banks can join the risk dots across their ecosystems, they will continue to spend at a very high rate with unsatisfactory and, at times, devastating results.

Improved data analysis and machine learning capabilities also featured prominently in the winning case studies. A central data platform, automated processes, and improved insights have produced notable increases in efficiency, better control of costs, reduced resourcing requirements, and fewer errors and false positives, and they have made it easier for the banks to adapt to their digital footprint, an expanding cyber threat landscape, and intense and complex regulatory obligations.

Hopefully, no flubs on the big day

Without exception, every submission is of high quality, and we found it a daunting task to pick the most worthy award recipients. In the end, we are excited and confident about our selection of winners in the above categories, yet we are sorry that we could not recognize the many others that clearly also deserve recognition.

At the moment we are staying tight-lipped about who won the awards. We will be announcing all winners publicly on April 4 at our 2017 Innovation & Insight Day in Boston. In addition to presenting the award trophies to the winners, Celent analysts will be discussing broader trends we’ve seen across all nominations and will share our perspectives why we chose those particular initiatives as winners. Make sure you reserve your slot here while there are still spaces available!

 

How to Woo a Bank

When it comes time to choose a business partner, banks will favor those who help them execute their third party risk management (TPRM) responsibilities over those who begrudgingly comply.

The risk to a bank of doing business with a third party is real; the consequences of a risk event are not only disruptive, but often result in long-term reputational damage that can seriously affect the bottom lines of both the bank and the third party. We have all seen the media coverage. Parties who can make TPRM easier for banks by being proactive, transparent, and helpful will distinguish themselves in an ever more competitive environment.

They must show that they are compliant with the bank's risk management requirements throughout the RFP, due diligence, and onboarding processes, and across the lifecycle of the engagement. OCC TPRM regulations alone require the bank to evaluate 16 risk dimensions when engaging with a third party. And, if the relationship involves a high-risk or critical activity, the bank will carry out a much more thorough due diligence, often including an on-site visit to inspect the operational risk procedures that would apply in the case of a risk event.

Furthermore, there is now an expectation that the third party will willingly take a portion of the liability of such an event.

Banks are introducing a new level of discipline and quantification around the measurement of third party risk. With this knowledge, banks can determine third party indemnification provisions and the allocation of liabilities at the contract stage. You will be at a disadvantage if you do not have a way to measure and verify the scope of a potential risk event that involves your products or services.

Celent is also beginning to witness the inclusion of provisions within contracts that require a third party to reimburse the bank for out-of-pocket costs relating to data security breaches that occurred due to the third party's negligence. As banks continue to push back on third party risk liabilities, third parties need to ensure they have in place insurance policies that can fund indemnification obligations.

My two recent research reports discuss the changing and expanding landscape for TPRM and explain why banks, regulators, and third parties need to commit to their significant other in the management and responsibility of risk.

Banking Third Party Risk Management Requirements are a Big and Expensive Ask

Celent, through its work with Oliver Wyman, estimates the cost to US financial institutions of undertaking due diligence and assessment of new third party engagements to be ~$750 million per year. Institutions are paying three times as much as their third parties to complete this exercise. The average cost to an institution to carry out due diligence and an assessment of a new critical third party engagement is $15,000, and the process takes the institution approximately 16 weeks to complete.
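As a back-of-the-envelope check on those figures, dividing the industry-wide spend by the per-engagement cost gives an implied volume of assessed engagements. The result is my extrapolation (it treats every engagement as a critical one at the average cost), not a published Celent statistic.

```python
# Back-of-the-envelope check on the due diligence cost figures above.
# The implied engagement count is an extrapolation, not a published number:
# it assumes every assessed engagement costs the $15,000 critical-engagement
# average, which in reality overstates the cost of lower-risk engagements.

INDUSTRY_COST_PER_YEAR = 750_000_000   # ~$750M across US financial institutions
COST_PER_ENGAGEMENT = 15_000           # average for a new critical engagement
WEEKS_PER_ENGAGEMENT = 16              # average elapsed time per assessment

implied_engagements = INDUSTRY_COST_PER_YEAR / COST_PER_ENGAGEMENT
print(f"Implied new engagements assessed per year: {implied_engagements:,.0f}")
```

Run as written, that works out to roughly 50,000 engagements assessed per year across the industry, each tying up a bank team for about four months.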

The top ten US banks average between 20,000 and 50,000 third party relationships. Of course, not all of these relationships are active or need extensive monitoring. But the slew of banking regulatory requirements for third party risk management is proving to be complex, all-consuming and expensive for both institutions and the third parties involved. In a nutshell, institutions are liable for risk events of their third and extended parties and ecosystems. The FDIC expresses best the sentiment of worldwide regulators:

“A bank’s use of third parties does not relinquish responsibility… but holds it to the same extent as if the activity were handled within the institution." www.fdic.gov

If an institution doesn’t tighten its third party risk management, it is significantly increasing the odds of a third party data breach or other risk event and will suffer the reputational and financial fallout.

In the first report of a two-part series just published by Celent, "A Banker's Guide to Third Party Risk Management: Part One Strategic, Complex and Liable", I show how institutions can take advantage of established risk management practices, such as the Three Lines of Defense governance model and operational risk management processes, to identify, monitor, and manage the lifecycle of critical and high-risk third party engagements across functions and levels. The report describes the components required for a best-practice program and shows examples of two strong operating risk models being used by the industry that incorporate third party risk management into the enterprisewide risk management program.

Unfortunately, few institutions have successfully implemented strategic third party risk management programs. Most fall between stages 1 and 2 of the four stages of Celent's Third Party Risk Management Maturity Curve. But continuing to operate without a strategic third party risk management practice will leave your institution in the hands of cyber fate and the regulators.

Stop Throwing Money at Cybersecurity

Most cyberattacks succeed because of weaknesses in people, processes, controls, and operations. This is the definition of operational risk. Therefore, it makes sense to tackle cyber risk with the same tools you use to manage operational risk.

Experience continues to show that leaving cybersecurity to the IT department alone is not working. Cyber risk is typically treated in parallel with other technology risks; the IT department is motivated to focus on securing the vulnerabilities of individual system components and proffers a micro view of security concerns.

My new Celent report, "Treating Cyber Risk as an Operational Risk: Governance, Framework, Processes and Technologies", discusses how financial institutions are advancing their cybersecurity practices by leveraging their existing operational risk frameworks to centralize, automate, and streamline management, technologies, processes, and controls for sounder and more resilient cybersecurity.

The report identifies and examines the steps required to achieve a risk-based approach to a sustainable and, ultimately, a measurable cyber risk management strategy:

1. Establish a long-term commitment to drive a top-down, risk-based approach to cybersecurity.

2. Recognize that the traditional approach of the IT department managing cybersecurity is limited and that most cyber risks are weaknesses in people, processes, controls, and operations.

3. If you have not already, consider deploying the NIST cybersecurity framework and tailor the framework to fit your individual cybersecurity requirements. The framework lets you take advantage of your current cybersecurity and operational risk language, processes and programs, industry standards and industry best practices. Both cyber and operational risk should be informed by and aligned with the institution’s enterprise-wide risk management framework.

4. Move your organization along the cybersecurity maturity curve by building dynamic risk models, based on shared industry data and assumptions, to measure and monitor cyber threats and pre-empt those attacks.

5. Stop throwing money at the problem. Educate decision-makers on why and how breaches happen. Do not purchase in silos or under pressure; select the right expertise to identify the issues and carry out due diligence on products.

6. Use the NIST’s five functions to navigate and manage cybersecurity technology requirements and purchases.

7. Know what technology you want from your vendors; know what advice to seek from your consultants.

8. Acknowledge that cybersecurity is the responsibility of every employee and that human behavior is the most basic line of defense. Institutions cannot hesitate in their goal to educate employees, third parties, and customers.
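The NIST functions referenced in steps 6 and 7 can double as a simple gap checklist for technology purchases. The sketch below maps the framework's five core functions to a few example controls; the control names are an illustrative subset I chose, not NIST's full category and subcategory taxonomy.

```python
# Illustrative mapping of the NIST Cybersecurity Framework's five core
# functions to example controls. The control names are a partial sketch
# chosen for illustration, not the framework's full taxonomy.

NIST_FUNCTIONS = {
    "Identify": ["asset inventory", "risk assessment", "governance"],
    "Protect":  ["access control", "awareness training", "data security"],
    "Detect":   ["anomaly monitoring", "continuous monitoring"],
    "Respond":  ["response planning", "communications", "mitigation"],
    "Recover":  ["recovery planning", "improvements"],
}

def uncovered_functions(deployed_controls: set[str]) -> list[str]:
    """Return NIST functions with no deployed control, flagging purchase gaps."""
    return [
        fn for fn, controls in NIST_FUNCTIONS.items()
        if not deployed_controls.intersection(controls)
    ]
```

For example, an institution that has bought only access control and asset inventory tooling would see Detect, Respond, and Recover flagged as uncovered, which is exactly the kind of siloed, pressure-driven purchasing pattern step 5 warns against.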

Security, fraud, and risk Model Bank profiles: Alfa Bank and USAA

Banks have worked hard to manage the different risks across their institutions. It has been, and will remain, costly, time consuming, and a top priority. Celent profiles two award-winning banks that have modeled excellence in their use of risk management technologies.

They demonstrated:

  1. Degree of innovation
  2. Degree of difficulty
  3. Measurable, quantitative business results achieved

(Left to right, Martin Pilecky, CIO Alfa-Bank; Gary McAlum, SVP Enterprise Security Group USAA; Joan McGowan, Senior Analyst Celent)

ALFA-BANK: SETS THE STANDARDS FOR BASEL COMPLIANCE IN RUSSIA

Alfa-Bank built a centralized and robust credit risk platform to implement Basel II and III standards simultaneously, under very tight local regulatory deadlines. The bank decided to centralize all corporate credit-risk information onto a single platform that connected to front office systems and processes. Using Misys FusionRisk, Alfa-Bank was able to implement a central default system with a risk rating and risk-weighted asset calculation engine. The project is seen as one of the most important initiatives in the bank's history, and its successful completion has placed Alfa-Bank at the forefront of setting standards and best-practice methodologies for capital management regulations for the Russian banking industry and Central Bank.

USAA: SECURITY SELFIE, NATIVE FINGERPRINT, AND VOICE SIGNATURE

The game-changer for USAA is delivering flawless, contextual customer application services that are secured through less intrusive authentication options. The use of biometrics (fingerprint, facial, and vocal) to access its mobile banking application positions USAA to compete with fintechs across the digital banking ecosystem and to offer exceptional service to its military members and their families.

USAA worked with Daon Inc. to provide biometric solutions paired with its "Quick Logon" dynamic security token technology, which is embedded in the USAA Mobile App for trusted mobile devices. Biometric and token validation focus on who the user is and who the verifiers are, and they address increasing concerns around the high level of compromise of static user names, passwords, and predictable security questions from sophisticated phishing attacks, external data breaches, and off-the-shelf credential-stealing malware.

For more information on these initiatives, please see the case study abstract on our website.     

Large FIs spent $25M rolling out failed risk management frameworks during the 2000s. So why try again?

Large financial institutions spent in excess of $25 million on rolling out enterprise risk management frameworks during the 2000s, and those frameworks failed. So why try again? For many obvious reasons, the most notable of which is the large-scale failure of institutions to manage their risks and the well-editorialized consequences of those failures. The scale of fines for misconduct across financial services is staggering, and the damage to the banking industry's reputation will be long-lasting.

(Chart: Major Control Failures in Financial Services. Source: publicly available data)

Regulators and supervisors are determined to stop and reverse these risk failures, specifically the poor behavior of many bankers. They are demanding that the board and executive management take full accountability for securing their institutions, and there is no room for failure. This is the only way that risks can be understood and, hence, managed across the enterprise.

There is no denying that risk management frameworks are hard to implement, but Celent believes the timing is right for the industry not only to secure its institutions and businesses but to innovate more safely and, slowly, win back the trust of its customers. My recently published report, Governing Risk: A Top-Down Approach to Achieving Integrated Risk Management, offers a risk management taxonomy and governance framework that enables a financial institution to address the myriad risks it faces in a prioritized, structured, and holistic way. It shows how strong governance by the board is the foundation for a framework that delivers cohesive guidance, policies, procedures, and control functions that align your firm's risk appetite to returns and capital allocation decisions.

Proposed new cyber security regulations will be a huge undertaking for financial institutions

The New York State Department of Financial Services (NYDFS) is one step closer to releasing cyber security regulations, spurred on by the largest security hacking breach in history, against JPMorgan Chase. The attack on JPMorgan Chase is revealed to have generated hundreds of millions of dollars in illegal profit and compromised 83 million customer accounts. Yesterday (Tuesday, November 10), the authorities charged three men with what they call "pump and dump" manipulation of publicly traded stock, mining of nonpublic corporate information, money laundering, wire fraud, identity theft, and securities fraud. The attack began in 2007 and crossed 17 different countries.

On the same day as the arrests, the NYDFS sent a letter to other state and federal regulators proposing requirements around the prevention of cyber attacks. The timing will undoubtedly put pressure on regulators to push through strong regulation. Under the proposed rules, banks will have to hire a chief information security officer with accountability for cyber security policies and controls, and mandated security training will be required. Tuesday's letter also proposed a requirement for annual audits of cyber defenses. Financial institutions will be required to show material improvement in the following areas:
  1. Information security
  2. Data governance and classification
  3. Access controls and identity management
  4. Business continuity and disaster recovery planning and resources
  5. Capacity and performance planning
  6. Systems operations and availability concerns
  7. Systems and network security
  8. Systems and application development and quality assurance
  9. Physical security and environmental controls
  10. Customer data privacy
  11. Vendor and third-party service provider management
  12. Incident response, including by setting clearly defined roles and decision making authority
This will be a huge undertaking for financial institutions. Costs have yet to be evaluated but will run into the millions of dollars. It will be very difficult to police third party security because, under the proposal, vendors will be required to provide warranties to the institution that security is in place. The requirements are in the review stage, and financial institutions should join the debate by responding to the NYDFS letter.

IBM’s Cognitive Bank: Big Data, bigger problems

Last Wednesday I attended IBM's analyst presentation on Transforming Banking and Financial Markets with Data. The crux of the presentation was the benefits of big data and cognitive analytics for financial markets. The return from better understanding the desires of an individual bank customer is well understood, and IBM did a good job of illustrating the uplift. But what was not discussed are the daunting challenges and complexities a bank will face in implementing and managing a big data project. The implementation and ongoing management of data will make or break the success of cognitive computing.

What I would like to see is an open discussion on the successes and failures of big data implementation programs by the banks, IBM, and other vendors working in this space. How smooth was the implementation process (time/budget/resourcing, etc.)? Were your expectations set correctly? Did you get the required support from management? What were the lessons learned? What value do you see from your big data program?

It's not easy

Structured data tends to sit in multiple databases housed in siloed legacy systems; it is customized, lacks consistency, has incomplete fields, is often latent in nature, and is prone to human error. All of this compounds the complexity of managing the data. Add to structured data the volume, variety, and velocity (known as the three Vs) of unstructured data, and the challenge of implementing and managing information becomes even greater. And the larger and more complex the bank, the more likely its data architecture and governance processes will hinder data-based implementation projects.

Automating the management of data is time consuming and laborious, and scope creep is significant, adding months onto implementation projects as well as extra expense and frustration. Resourcing such projects can be taxing because the pool of big data expertise is limited and expensive.
To perform cognitive analytics, massive parallel processing power is required, and the most cost-effective operating environment is the cloud. If you get the data right, cognitive analytics can be very powerful.

Cognitive analytics

Cognitive analytics (also referred to as cognitive computing) is a super-charged power tool that allows data scientists to crunch vast amounts of structured and unstructured data and to codify the instincts and learnings found in that data in order to develop hypotheses and recommendations. Recommendations are ranked based on the confidence the computer has in the accuracy of the answer. How you rate confidence was not made clear by IBM, and I would argue that this can only come after the fact, when you can use KPIs to validate the scoring and criteria. The modeling techniques include artificial intelligence, machine learning, and natural language processing and, unlike us mere mortals, the more data you feed the computer, the higher the quality of the insight.

If you do get it right, the rewards are significant

We continue to leave behind mind-boggling amounts of digital information about our lifestyles, personalities, and desires. A sample of sites where I know I have left a hefty footprint includes Facebook, Reddit, LinkedIn, Twitter, YouTube, iTunes, blogs, career sites, industry associations, search history patterns, buying patterns, geolocations, and content libraries. IBM Watson offers banks a cost-effective way, through the cloud, of scouring such data to build up clues that provide a more in-depth view of what their customers desire. Current analytic segmentation is requirements-based and is modeled on past behavior to determine and influence future behavior. The segmentation buckets are broad, and everyone within them is treated the same.
Cognitive analytics allows a much more precise and immediate analysis of behavioral characteristics in different environments and, therefore, a more personalized and satisfying experience for the customer.

I'd welcome any feedback from those of you who have been involved in implementing, or are in the process of implementing, big data in banking. And, if you're interested, take a look at Celent's Dan Latimore's blog, Implementing Watson is Hard.

On a side note, IBM introduced the term Cognitive Bank, and it is not a phrase that works for me. It is disconcerting to describe a bank as having the mental processes of perception, memory, judgment, and reasoning. Looking forward to hearing from you.