Mainframe data integration for digital innovation and cloud analytics
The pressures placed on IT over the past decade are exceptional, and nowhere are they felt as acutely as at the intersection of the mainframe and the digital world. Consider the crushing volume of web transactions against mainframe applications for economic relief services during the COVID-19 pandemic, and the continuing surge in cloud adoption across the corporate world and public sector.
Why is the intersection between mainframe and digital so important? It’s estimated that 80% of the world’s corporate data resides or originates on the mainframe. And every piece of this data is valuable. In finance alone, mainframes process $8 trillion in payments every year.
But now information is growing exponentially off the mainframe—especially if you consider the proliferation of cloud/SaaS apps and non-transactional data like email, blogs and instant messages. IDC projects that by 2025 worldwide data will grow to 175 zettabytes, with as much of the data residing in the cloud as in data centers.
You need to correlate existing real-time transactional data with newly created digital data to better serve customers, drive innovation and increase efficiencies. If your team ignores the wealth of information contained in the silos of legacy data, you are just as disadvantaged as those who don’t adapt to the cloud and new technologies fast enough.
Businesses and government organizations store critical data in many different systems, across a variety of platforms and locations. Data on mainframes such as IBM zSystems® and legacy platforms like OpenVMS®, iSeries (AS/400) and UNIX® can be leveraged for business intelligence as well as cloud and digital transformation initiatives. But to gain agility and insights from this data, organizations need an easy, fast and cost-effective way to access, virtualize and move the business data—wherever it resides—so it can be connected to modern applications and data platforms in the cloud.
Instead of replacing or consolidating existing systems and applications, leverage the information they contain to deliver data that can power your digital innovation. Mainframe data integration helps you take advantage of cloud-based data warehouses and other modern technologies while avoiding the costs and risks associated with migration to new platforms or databases.
How will you:
- Access mainframe and non-relational data, without disrupting performance?
- Accelerate cloud adoption while maintaining business stability and continuity?
- Improve time-to-market for new products, services and applications?
- Power cloud-based, data-driven apps and platforms for engagement anywhere?
- Make optimal decisions by analyzing all your best data?
Demand for valuable mainframe data
You can get tremendous value from combining the historical and real-time transactional data in your existing mainframe systems with new information from digital channels. When you see that full picture, you can make informed decisions, better serve your customers and feed data-driven apps and analytics platforms.
Mainframes are at the center of enterprise operations. They are used to make critical business transactions and are strategic to the core of your business, especially in banking, insurance, government, transportation, communications and manufacturing. Mainframe transactions create massive data sets that contain decades of insights about customers, products and your business that can set you apart.
Mainframe momentum
A whopping 67% of North American enterprise infrastructure technology decision-makers use mainframes. Forrester’s survey of the financial services sector found that mainframe usage continues to increase and that firms are modernizing their applications to run more workloads on the platform. An impressive 87% of those surveyed believe the mainframe is a viable long-term strategic platform.
As interest in cloud computing increases, there is a misconception that the value and usage of the mainframe are decreasing. Instead, we’ve seen strong growth momentum in IBM zSystems® (mainframe) workloads and know that core applications still matter. The mainframe continues to thrive today because its proven systems offer fast, reliable performance, backed by decades of investment in the unique business logic of applications typically built in COBOL. It’s hard to overstate the value of business continuity and trusted security.
But the mainframe cannot live alone in a silo. The demand for its transactional data, particularly for digital transformation, cloud innovations and data analytics, requires organizations to consider a hybrid architecture that allows seamless operations between on-premises applications and cloud platforms.
So who needs mainframe data integration, and why?
Power data-driven apps for anywhere engagement
SaaS, cloud apps, portals, mobile and smart devices are everywhere. Customers, citizens and partners now expect your services and information through whatever channel they choose. In essence, the focus is on engaging users anywhere with fast, self-service capabilities.
Startups are unencumbered by legacy architectures and can move fast. However, enterprises with mainframe applications have the advantage of well-established processes and a wealth of customer history data. They can exploit this competitive advantage if, and only if, they can connect their enterprise data to the channels of engagement—matching the agility of a startup and surpassing them in quality engagement.
Take, for example, a large vacation property exchange that was being outcompeted for customers by lean startups on the web and mobile. With decades of historical customer data on the mainframe, it knew it could beat the competition if they could just reach customers how and where they wanted to engage—online, anytime.
It was fast and easy to set up a web app—just like the startups—but the game-changer was using Structured Query Language (SQL) access to link the web app to the company’s core mainframe booking application database on VSAM. Customers gained the ability to search property availability and complete registrations, bookings and check-ins from a web browser or mobile app in milliseconds. The impact was massive new revenue, and the agility to outcompete the startups by scaling and supporting any new technology.
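To make this concrete, here is a minimal Python sketch of what such SQL access from the web tier could look like, assuming a hypothetical ODBC data source name (CONNX_VSAM) and an illustrative BOOKINGS table; none of these names come from the actual deployment:
```python
# Hedged sketch only: shows the pattern of SQL access to a virtualized
# VSAM table over ODBC. The DSN, credentials, table and column names
# are hypothetical, not taken from the actual deployment.
import pyodbc

def find_available_properties(region, week_no):
    conn = pyodbc.connect("DSN=CONNX_VSAM;UID=webapp;PWD=example")
    try:
        cursor = conn.cursor()
        # The virtualization layer presents the VSAM booking file as a
        # relational table, so a plain parameterized SELECT works.
        cursor.execute(
            """
            SELECT property_id, resort_name, week_no
            FROM BOOKINGS
            WHERE region = ? AND week_no = ? AND status = 'AVAILABLE'
            """,
            region,
            week_no,
        )
        return cursor.fetchall()
    finally:
        conn.close()

# A web request handler could then call, for example:
# rows = find_available_properties("EUROPE", 27)
```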
360-degree customer view
What better way to anticipate and meet customer expectations than to have a singular comprehensive view of a customer’s data—spanning a customer’s basic contact information, their purchase history and details on interactions they have with sales associates, customer support and digital properties? This 360-degree customer view can help decision-makers and marketing and sales teams make data-driven decisions about customer buying habits and how to best serve their customers.
Yet consolidating customer information from multiple touchpoints and systems in one place, or view, presents challenges. Purchasing and logistics data is likely processed on a mainframe, sales interactions may be tracked in a SaaS app such as Salesforce and online interactions with web content are often tracked in a cloud application. In the past, organizations tried to gain that 360-degree customer view by consolidating information into a central data warehouse, but that approach to mainframe data integration has limitations, such as the long latency between batch data uploads.
The disadvantage of delayed data consolidation was evident to a large bank’s wealth management advisors, who lacked 24x7, instant access to the financial status, personal data and portfolios of high-value customers. Its traditional data warehouse provided advisors with out-of-date information during customer calls—resulting in loss of expert authority and trust.
Data virtualization provided a more viable alternative. By creating a virtual data warehouse that could access data from multiple systems like Sybase, Oracle, IBM i (AS/400) and mainframe in real time, the advisors had access to customers’ latest trades. The impact was immediate. Customer trust soared. Advisors gained a complete view of everything they needed to know about the customer, no matter who called or when—at their fingertips, and in real time. The bank’s fortunes rose along with the wealth it managed for individuals, institutions and non-profit charities alike.
Real-time information for the mobile workforce
Marketing and customer service departments aren’t the only ones interested in providing self-service to customers. Almost every industry supports a mobile workforce that could benefit from real-time access to enterprise data. From law enforcement in the community to utility workers repairing equipment to restore services, there is a demand to provide mobile workers with the information they need, when they need it.
It’s easy to imagine the safety benefits if an officer could look up real-time police and court information about the person just pulled over for a traffic infraction to see whether they have repeat violations, outstanding tickets or even a summons to attend court for an unrelated violation, right from their vehicle. This is exactly what a US state information agency is doing by creating a web app with a single sign-on system for law enforcement officers in the state to view incidents, arrest warrants and even delinquent child support payments, all information that is maintained on-premises in non-relational mainframe databases as part of its criminal justice information system.
Data analytics and business intelligence
Data analysts and data engineers know that including the wealth of mainframe data in modern cloud analytics efforts can be game-changing. For example, combining historic customer purchase data with online shopping trends can help improve customer interactions and provide the foundation for new services. It can increase revenue by identifying patterns that improve marketing and sales initiatives. It can also save money by using that data to gain insights that improve operational planning and resource management, or uncover hidden business risks and counter fraud.
To support greater insights and better decision-making, data analysts want to leverage all enterprise data in the cloud data platforms they've selected (e.g., Snowflake, Redshift, BigQuery, Databricks, S3, ADLS2, GCS) and use the analytic tools they're most familiar with (e.g., Tableau, Looker, PowerBI, SAS, Sagemaker, Dataiku).
The goal is to turn data into business value by feeding data consumers—internal and external stakeholders, and customers—the data they need, when they need it. Those who can include mainframe data in their AI and analytic platforms will make better decisions faster and stay ahead of the competition.
Challenges to mainframe data integration
Most mainframe and legacy data sources are non-relational and proprietary, with no common integration standard, making them hard to access and integrate. The old ways of sharing mainframe data with point-to-point integrations and manual programming won’t work anymore because the volume of data is growing and the speed of change is accelerating. The differences in priority between those who maintain the data and those who need the data can also cause conflicts of interest.
Mainframe data is unique
Many mainframe and legacy data stores, such as VSAM, IDMS, IMS and RMS, are proprietary and do not provide native SQL access, making it difficult for organizations to access and integrate this data for consumption on modern platforms and applications. Modern tools and ecosystems rely heavily on SQL, the most commonly used language for retrieving and organizing data from relational databases. It is the standard used by everything from analytics platforms to desktop query tools like Excel®. Data access application programming interfaces (APIs) are also based on SQL: applications access data through JDBC, ODBC, OLE DB, Microsoft .NET and more. And because APIs are one of the most powerful, agile and lightweight ways to innovate on web and mobile, they need to be a priority.
Most mainframe data stores are also non-relational, meaning data is not laid out in the familiar tabular format of columns and rows. Ironically, the flexibility and benefits of storing information in a non-relational format are being embraced again by newer databases such as MongoDB. But for now, to be used by modern apps and tools, non-relational data still has to be translated into a table format that is readily understood by SQL.
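To make the translation concrete, here is a toy Python sketch of the flattening involved, under the assumption of a simple hierarchical record with one repeating group; the layout and field names are purely illustrative:
```python
# Hedged sketch only: flattening a hierarchical, mainframe-style record
# (one customer with a repeating group of accounts) into rows for two
# relational tables. The record layout and field names are illustrative.
record = {
    "cust_id": "C1001",
    "name": "J. SMITH",
    "accounts": [  # repeating group, as a COBOL OCCURS clause would define
        {"acct_no": "A1", "balance": 2500.00},
        {"acct_no": "A2", "balance": 125.75},
    ],
}

# Parent table: one row per customer.
customer_row = (record["cust_id"], record["name"])

# Child table: one row per occurrence, keyed back to the parent.
account_rows = [
    (record["cust_id"], acct["acct_no"], acct["balance"])
    for acct in record["accounts"]
]

print(customer_row)  # ('C1001', 'J. SMITH')
print(account_rows)  # [('C1001', 'A1', 2500.0), ('C1001', 'A2', 125.75)]
```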
Mainframe data is also unique in that it is protected with high levels of security because, as stated before, the transactions being managed by mainframe applications are of high value and core to the business.
Traditional approaches are not scalable
To meet the demands of mainframe data integration, IT staff would historically use FTP to manually transfer files to data warehouses, write manual scripts to feed applications the needed data or use point-to-point integrations that require frequent updates. With limited time and resources—compounded by the retirement of mainframe experts—organizations may also rely on multiple vendors to address the diversity of integration use cases, only adding complexity and cost to an already exhausting process. These approaches are not scalable to meet the demand for integrating increasingly diverse data sources.
Business and technology teams' priorities differ
There are stark differences in priorities between those who hold the data versus those who need the data. People who want to use the data focus on business impact and agility. They typically work with open systems, rely on SQL and have funding for insights, innovation and growth. The lines of business feel an urgency to move fast and get data where it needs to be to innovate quickly and beat the competition.
On the other hand, those who control mainframe systems are focused on business continuity and security. They know the mainframe systems and formats well but are typically short-staffed and face flat or declining budgets.
A solution that can overcome these differences so each can work together will prevent the rise of “shadow IT,” where applications are acquired outside of IT channels, and will also reduce the workload on stretched mainframe IT staff.
Prepare for data integration success with a change in mindset
Unfortunately, the old strategy of building custom point-to-point integrations for each new requirement is doomed to fail. It consumes too many resources and takes too long. While consolidating your disparate information in a data warehouse may work well for some analytics, it won’t fulfill the need for real-time data, any time of day.
If IT teams do nothing to address the business demands for enterprise data or don’t deliver fast enough, data consumers will find ways to work around them. This will only compound integration challenges, causing even more work for departments already stretched thin.
In a worst-case scenario, someone with influence may start advocating to rip and replace your long-lived enterprise system, a process that can quickly take your organization down an incredibly costly rabbit hole.
A change in mindset can help you include more valuable mainframe data, more quickly, in your digital transformation and cloud analytics initiatives.
3 keys to maximizing mainframe data value
What if you had non-invasive integration options that require no new expertise and could quickly connect your enterprise data with new applications and tools, inside and outside your organization, on-premises or in the cloud?
What if you could quickly adopt new technologies and modernize your business while maintaining business continuity by not disrupting what works?
If you accept that data silos are an unavoidable consequence of constant innovation, yet have tools to unite the differing sources, formats and semantics into a common language—you can break down the barriers to effectively harness data to make better decisions and serve customers.
What you need is a solution that provides:
Secure, real-time mainframe data access
In theory, having all of your data in one place is a great concept. It’s just not realistic as data volumes continue to grow. Depending on the data type and quantity, it’s being stored in the cloud, in data lakes, in SQL and NoSQL databases, and on-premises in distributed and mainframe systems. No matter where your data is stored or what format it’s in, to maximize its value, your analysts, decision-makers and users must be able to access it.
Seek a data access solution that provides the breadth and depth of data connectivity for mainframe, mid-range, desktop and the latest cloud data platforms.
For hard-to-access enterprise data, find a solution that:
- Transforms mainframe data to be accessible and queryable using SQL
- Does not impact existing applications
- Uses minimal mainframe resources
- Adapts to existing mainframe security policies
- Provides read and write capabilities to connect mainframe to cloud and back again
Minimize mainframe impact
Data access must not disrupt the processing speeds and performance of your applications or platform. When this happens, systems can’t scale with load, costs spiral and users flee. Customers who are made to wait—or worse yet, can’t access services during periods of peak load—will switch to your competitors without hesitation.
Data queries should be executed off the mainframe. This saves computational resources and reduces costs, and it leaves your mainframe cool and efficient.
Low-latency data virtualization
Powerful magic happens when you can access data from multiple different databases and present it in a single virtual view as if it’s all living in a single database. That’s the gist of data virtualization. It’s a silo-killer. Suddenly, all your mainframe data, no matter the vendor, system or structure, is at your fingertips. And because you can visualize the data in one application or interface without actually having to move it, you get faster decisions without impacting your core systems.
Data virtualization is key to overcoming data silos and latency—the time lag between when an event occurs and the data describing that event is available for querying. Data virtualization brings the siloed data from multiple databases on a variety of platforms together into a single, comprehensible structure without altering the source structures. As a result, disparate data sources can be treated as one federated relational database, making real-time access with a single view a reality. This is ideal for powering dashboards, analytics tools and data-driven apps that need real-time data.
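Conceptually, a federated query can look like the hypothetical Python sketch below: one SQL join spanning a virtualized mainframe table and a cloud CRM table, issued over a single connection. Every name in it (the VIRTUAL_DW data source, schemas, columns) is an assumption for illustration:
```python
# Hedged sketch only: one SQL statement joining a virtualized mainframe
# table with a cloud CRM table through a single federated connection.
# The DSN, schema and column names are hypothetical.
import pyodbc

FEDERATED_QUERY = """
SELECT m.cust_id,
       m.lifetime_value,         -- from the mainframe system of record
       c.last_interaction_date   -- from a cloud CRM application
FROM MAINFRAME.CUSTOMERS AS m
JOIN CRM.INTERACTIONS AS c ON c.cust_id = m.cust_id
WHERE m.region = ?
"""

with pyodbc.connect("DSN=VIRTUAL_DW") as conn:
    for row in conn.cursor().execute(FEDERATED_QUERY, "WEST"):
        print(row.cust_id, row.lifetime_value, row.last_interaction_date)
```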
Uniform tables and columns are what make SQL databases so easy, consistent and reliable to work with. That’s why you need a clearly outlined and universal way to translate any data source from any database vendor into the uniform tables and columns that will help you bring your data alive. A metadata repository is key to translating diverse data to one common language.
An all-inclusive data virtualization solution should also include the often-overlooked mainframe and non-relational legacy data sources. COBOL applications and multiple legacy databases each need their own copybook translations to work with SQL. The best data integration solutions can retrieve meaningful names and format descriptions from these mainframe copybooks and apply them to the metadata repository.
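As a rough illustration of that translation step, here is a toy Python sketch that derives column names and types from a simplified copybook fragment; a production tool handles far more (REDEFINES, OCCURS, packed decimals), so treat this as a sketch of the idea only:
```python
# Hedged sketch only: deriving SQL-friendly column metadata from a
# simplified COBOL copybook fragment. Real copybooks (REDEFINES, OCCURS,
# COMP fields and so on) are far richer than this toy parser handles.
import re

COPYBOOK = """
05 CUST-NAME      PIC X(30).
05 CUST-BALANCE   PIC 9(7)V99.
05 LAST-ORDER-DT  PIC X(10).
"""

def copybook_to_columns(text):
    columns = []
    for _level, name, pic in re.findall(
        r"(\d+)\s+([\w-]+)\s+PIC\s+([\w()V]+)\.", text
    ):
        col = name.replace("-", "_")  # COBOL names -> SQL identifiers
        if pic.startswith("X"):       # alphanumeric field
            size = re.search(r"\((\d+)\)", pic).group(1)
            columns.append((col, f"VARCHAR({size})"))
        else:                         # crude mapping for numeric pictures
            columns.append((col, "DECIMAL"))
    return columns

print(copybook_to_columns(COPYBOOK))
# [('CUST_NAME', 'VARCHAR(30)'), ('CUST_BALANCE', 'DECIMAL'),
#  ('LAST_ORDER_DT', 'VARCHAR(10)')]
```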
Seek solutions that:
- Provide a single virtual view of multiple data sources, without moving data
- Create a common understanding across silos via metadata unified into a data catalog
- Carry over security definitions and extend them to multiple access levels
- Allow users to explore data without committing to data movement
Data movement options with transformation and CDC
Once you’ve overcome the challenge of getting access to mainframe data in a format that is understandable, you may also want a solution that will help you move data to any destination. Having options on how you move your data can help you harness the value of data for advanced analytics, self-service data science, AI and machine learning or to modernize data stores for applications, data warehouses or platforms.
While most data integration solutions provide support for batch, Extract Transform Load (ETL) and Extract Load Transform (ELT), seek a solution that also provides Change Data Capture (CDC) capabilities. CDC is vital to keep your data movement efficient by incrementally updating only the records that have changed, reducing the burden on the performance of your mainframe systems.
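In spirit, CDC-based movement works like the hypothetical Python sketch below, which applies only pending insert/update/delete records to a target rather than re-copying everything; the change-record format is an assumption for illustration:
```python
# Hedged sketch only: the CDC pattern in miniature. Apply just the
# changed records (insert/update/delete) to the target instead of
# re-copying the whole source. The change-record format is illustrative.

def apply_change(target, change):
    op, key = change["op"], change["key"]
    if op in ("I", "U"):           # insert or update -> upsert
        target[key] = change["row"]
    elif op == "D":                # delete
        target.pop(key, None)

def sync_incremental(target, change_stream):
    for change in change_stream:   # only what changed at the source
        apply_change(target, change)

warehouse = {"C1": {"balance": 100}}
changes = [
    {"op": "U", "key": "C1", "row": {"balance": 250}},
    {"op": "I", "key": "C2", "row": {"balance": 75}},
    {"op": "D", "key": "C1"},
]
sync_incremental(warehouse, changes)
print(warehouse)  # {'C2': {'balance': 75}}
```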
Transforming data in flight beyond simple ETL processes, both to change metadata and to support compatibility between cloud-based data warehouses, legacy and non-relational data sources, is highly desirable. Data transformation functions, sometimes referred to as processors, should enable complex transformations, including filtering, sorting, joins, unions, summarizations, enrichment, anonymization and aggregations. For example, given the sensitive nature of mainframe data, you will likely require a solution that can mask mainframe data elements to ensure data privacy as the data moves to a data warehouse for analysis.
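A masking processor, for example, might behave like this hedged Python sketch, which anonymizes an account number in flight; the field names are assumptions:
```python
# Hedged sketch only: an in-flight transformation ("processor") that
# masks a sensitive field before the row lands in the warehouse.
# The field names are illustrative.
def mask_account_number(row):
    acct = row["account_no"]
    # Keep the last four characters for reference; mask the rest.
    return {**row, "account_no": "*" * (len(acct) - 4) + acct[-4:]}

record = {"cust_id": "C1001", "account_no": "4111222233334444"}
print(mask_account_number(record))
# {'cust_id': 'C1001', 'account_no': '************4444'}
```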
Synchronize efficiently with transformation settings that give full control of scheduling incremental or full data movements, launching pre- or post-sync tasks, and setting up automatic email event notifications. You should be able to scale your synchronization tasks based on demand, such as scheduling updates for low-use periods, or as frequently as every minute when your business requires more up-to-date information.
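A minimal scheduling loop along these lines, sketched in Python with placeholder sync functions and assumed time windows, might look like this:
```python
# Hedged sketch only: a scheduling loop that runs an incremental sync
# every minute during business hours and a full sync in a low-use
# overnight window. The sync functions are placeholders.
import time
from datetime import datetime

def incremental_sync():
    print("incremental sync: changed records only")

def full_sync():
    print("full sync: complete refresh")

def run_scheduler():
    while True:
        now = datetime.now()
        if now.hour == 2 and now.minute == 0:
            full_sync()            # overnight, low-use window
        elif 8 <= now.hour < 18:
            incremental_sync()     # minute-level freshness by day
        time.sleep(60)             # wake up roughly once per minute

# run_scheduler()  # commented out: this loop runs indefinitely
```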
Seek a mainframe data integration solution that:
- Provides a selection of data movement options to fit a variety of use cases
- Captures changes from single or multiple data sources (change data capture)
- Includes a robust set of pre-built transformation/processing functions
- Provides a single design experience
Be the hero: Innovate while maintaining business continuity
It’s natural to want to tackle projects in a clean, linear fashion to ensure everything works together seamlessly. Maybe you even think it’s simpler to just put all your data and applications on the same platform and standardize on a single technology stack. But all too often, the world doesn’t work that way, especially for established enterprises that need operational continuity while concurrently innovating to stay ahead of the competition. Fortunately, with the right data integration solution, you can do both—innovate and maintain business continuity.
Mainframe data integration powers digital innovation
Meet Software AG’s CONNX mainframe data integration solution. With CONNX, you can easily access, virtualize and move your data—wherever it is, however it’s structured—without changing your core systems.
CONNX offers connectors to the most complex, hard-to-reach data sources and platforms—including mainframe, OpenVMS, IBM i (AS/400), UNIX®, Linux®, Windows® and cloud—using common connectivity mechanisms such as ODBC, JDBC, .NET, J2EE and OLE DB. This lets you translate all your data sources, including non-relational databases, to SQL standards, while keeping your existing security measures intact.
Why does this matter? Now you can quickly leverage modern apps built in Java, Python, Visual Basic, C, C++, PHP, .NET or other web tools and provide easy access to VSAM, IMS, Adabas, Db2, Datacom and other legacy and non-relational data through the CONNX DB Adapters. This allows you to fuel digital innovation while leaving your original data intact and existing operations undisturbed.
With CONNX’s powerful change data capture (CDC) technology, event-driven data can be captured, transformed and moved incrementally. By updating only the records that have changed, you keep your data fresh and current without impacting mainframe performance, ensuring business continuity.
Meet the demand to connect mainframe data, fast and easily to:
- Support digital and cloud transformation
- Power data-driven web, mobile, cloud, and SaaS apps
- Enable multi-channel engagements on web, cloud, mobile, smart devices and APIs
- Modernize applications or data warehouses
Mainframe data integration for cloud analytics
If you are ready to leverage your rich mainframe data in your cloud analytics platforms, look no further than CONNX. CONNX handles acquiring, securing and providing access to mainframe and legacy data, then creates a bridge from that data to the cloud and readies it for analytics.
With CONNX, analysts can easily access existing data sources using Structured Query Language (SQL), the lingua franca of analytic and business intelligence tools. With CONNX’s built-in metadata model, you can properly prepare data so that it is meaningful for analysis or application use. By placing a relational model on legacy data, the source data is abstracted into a common framework easily understood by analysts, developers and common apps and tools. This powerful metadata model also opens the door for data virtualization—allowing you to access data from multiple different databases and present it in a single virtual view as if it’s all living in a single database.
Sometimes just accessing the data isn’t enough; you may want to push your data to third-party databases, cloud data lakes, data streaming or messaging systems. CONNX’s transformation settings give you full control of scheduling incremental or full data movements, launching pre- or post-sync tasks, and setting up automatic email event notifications. You can scale your synchronization tasks based on demand, such as scheduling updates for low-use periods, or as frequently as every minute when your business requires more up-to-date information.
With CONNX data integration, you can now include mainframe data in your cloud analytics efforts and build data pipelines to your most valuable data. The depth of insights gained from including mainframe data can be game changing:
- Improve customer interactions and offer new services
- Spot patterns to improve marketing and sales to increase revenue
- Improve operational planning and resource management to save money
- Uncover hidden business risks and counter fraud
Exploit the value of your mainframe data
Your mainframe’s core applications, their tailored business logic and data already differentiate you from your competitors and act as a reliable backbone for your business operation. By making this “known” data more broadly accessible and easily integrated with other apps and platforms, you can quickly meet the demands of the digital enterprise today, tomorrow and into the future.
Are you ready to cross the chasm between the new and the old? Let’s make the leap together.
Explore mainframe data integration in action
Beating the startups at their own game
Mainframe access from web and mobile apps in a click
One of the world’s largest vacation ownership and exchange companies helps families find the perfect resort, rental or experience in over 110 countries. Its team of 15,000 associates helps guests make memories at over 200 vacation clubs and 4,200 affiliated exchange properties around the world. That success depends on a custom IBM zSystems® solution that offers unparalleled reliability and uptime but makes it truly challenging to give customers multi-channel access to vacation properties.
- The challenge
Customers expected fast access to vacation properties through web and mobile interfaces. They were not going to wait for a call center when digital-native competitors put everything at their fingertips. The company needed to deliver instant multi-channel customer services without wasting time and resources replacing its solid mainframe application.
- Enabling a multi-channel experience with data integration
The key goal for the vacation company was to keep the advantages of its mainframe system while expanding to deliver instant, scalable web and mobile access. Its property booking and sales system was originally built on a VSAM backend, which never went down. The data integration package the company chose, CONNX, was the only solution that could connect the mainframe data to the web apps.
By translating the company’s existing VSAM data for bookings into SQL and presenting it in a relational format, CONNX enabled the web and mobile applications to view booking data and provide a real-time response to users. SQL is the standard used by web and mobile applications to communicate directly with a database, simplifying accessibility.
With direct database access, their data integration choice helps keep the vacation company’s customers online through booking completion.
- New revenue, unlimited capacity, and any future technology
Unleashing the power of the mainframe for the web, mobile and more has paid dividends for the business. The move has brought in new revenue, and the ability to scale capacity with connection demand means the enterprise is never caught off guard.
Rebuilding cities: Bringing the app to the mainframe for modern interactions
- E-government services
One of the world’s largest cities provides access to all government services and information through a centralized service. It is fast, free and easy, available online and in mobile apps in over 100 languages, and 24/7 by telephone. But to power tens of millions of interactions per year, the service must integrate data from over 120 agencies, offices and organizations across the city.
- The challenge
The only way the city could bring its e-government services to the web and mobile without mainframe access technology was with a full rewrite to a new platform. Based on similar projects, the city realized that path would have cost hundreds of millions of dollars—and risk falling short. Historically, such municipal rewrites have a failure rate of over 50 percent.
- Unlocking one-touch government services
Project leaders realized they could keep all the advantages of a robust mainframe while their chosen data integration solution, CONNX, brought digitized services to any platform, from web to tablet to mobile. That provided a safer, faster, lower-cost way forward. Best of all, setup and implementation promised to be fast, cheap and successful.
Now, the metropolis continues to use Adabas and VSAM on the backend, and its other powerful custom applications, to deliver services throughout the city. While these systems stay in place, the city can offer modern iOS and Android apps that give residents zip code- and address-based individual access to city services.
- Ready for IoT (and anything else) in a day
The system has already saved the city millions of dollars in both setup costs and ongoing operations. Automating bill payment and other services adds force multipliers such as timelier delivery of services, fewer errors and faster turnaround times. And it eliminates the drudgery of menial data entry by city employees. If the city wants to implement a new service, it can write a simple query of about a dozen lines and finish testing in less than a day. That means the city can go from idea to implementation in as little as a month, limited only by the process of studying the impact of the new service.
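For a feel of the scale involved, a “simple query of about a dozen lines” for a hypothetical new service might resemble this sketch, with every name invented for illustration:
```python
# Hedged sketch only: the kind of short query a new city service might
# need, issued against virtualized legacy data. The DSN, schema, table
# and column names are hypothetical.
import pyodbc

OPEN_REQUESTS_BY_ZIP = """
SELECT request_id,
       service_type,
       opened_date,
       status
FROM CITY.SERVICE_REQUESTS
WHERE zip_code = ?
  AND status <> 'CLOSED'
ORDER BY opened_date DESC
"""

with pyodbc.connect("DSN=CITY_SERVICES") as conn:
    for row in conn.cursor().execute(OPEN_REQUESTS_BY_ZIP, "10007"):
        print(row.request_id, row.service_type, row.status)
```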
Tackling an ‘unsolvable’ data migration problem to enable a billion-dollar deal
- Bank divests
One of the largest banks in the world serves over 9 million customers and maintains nearly 1,000 locations. Recently, the company wanted to sell a majority stake of its insurance arm, which required the creation of a data warehouse to share with the acquiring company.
- The challenge
Segregating and moving data from the parts of the business being sold appeared to be an unsolvable problem. An analysis by Accenture determined that using custom extraction scripts to separate the data would mean missing the transaction deadline by years. Factors complicating the sale included the diversity of the 11 legacy systems the bank had to access, the depth and complexity of the data involved, and the targets for transformation.
- Unmatched speed to build a data warehouse
The bank was shocked to discover that CONNX could do everything it wanted and more. Almost from the start, CONNX proved its value. The data migration project was a success from the outset, reaching production quickly and moving fast enough to save the deal from failure. And with Software AG at its side, the bank worked fast, light and at low cost, without compromising on security.
The bank had systems that were up to 35 years old, data reaching back further than that, and serious security compliance and customer data considerations it could not ignore. That’s why when it had tried to write custom extraction scripts—for everything from VSAM, QSAM, Adabas, and IMS, to Siebel and even SUPRA—failure looked inevitable.
CONNX suited the bank’s needs perfectly, helping it build a data warehouse to help with the data migration project and support future internal projects.
CONNX touches 40 systems containing policy, client, process and advisor information. Working on both structured and unstructured data, CONNX is not only used for access, but also for extraction, transformation, and loading (ETL) to the target Microsoft SQL Server.
While the bank was surprised that any solution existed at all, it was downright floored by the speed of extraction and run times, completing some of the necessary migrations in as little as a weekend.
- Come for the data warehouse, stay for the cloud
With the sale complete, the bank realized that CONNX adds value far beyond this one project. The solution could be used to open the doors to new profits, new security features and new technologies.
One of these technologies is the cloud. While the bank has enough on-premises hardware that processing power isn’t a concern, it is interested in how cloud data platforms like Snowflake can future-proof its business with improved data accessibility, redundancy and security.
The business plans to implement CONNX to support safer sharing of data with external parties, and improved layering of firewalls. Sticking with CONNX will help this bank thrive for the next 150 years—and beyond.
Winning wealth management customers through greater trust
- Global bank
One of the world’s largest banks has over 80,000 employees serving 17 million customers across 34 countries around the globe. It specializes in high-net-worth individuals and families, institutions and charitable organizations. This corner of the financial world lives or dies by the authority of information, exclusive and up-to-date analysis of trends and market movements—and above all, customer trust in the expertise of advisors. Too often, however, the bank’s traditional data warehouse provided out-of-date information during planned and unplanned customer calls—resulting in loss of expert authority and trust.
- The challenge
As the wealth management market has sped up, data has proliferated. Customers expect their advisor to have instant, reliable access to their information—which means data pulled from numerous sources, presented in a virtualized, accessible view. Without this capability, the bank’s advisors might refer to an outdated report or take too long to get data, causing the customer to take their business elsewhere. Doing nothing was not an option. But the solutions offered by Oracle, TIBCO and iWay all had critical downsides, from requiring too many components, to a constant need for re-writes and re-linking data, to vendor lock-in.
- Real-time data access and virtualization provides customer 360 view
CONNX offered the bank seamless, powerful and quick access to virtualized data across multiple legacy systems. The bank’s advisors gained total insight into everything a caller wanted to know—in real time, in simple dashboard views. And since the bank was already using Adabas, adding CONNX meant getting a complete solution from a single vendor.
The bank built a virtual data warehouse that can access and virtualize data, and often even transform it, across disparate legacy systems like Adabas, Sybase, AS/400, Oracle and beyond—all in real time.
This capability offers competitive differentiation, including the ability to mine for the right information, gather data and perform data transformation to fix errors on the fly. For instance, if the date of a particular customer’s stock trade exists in different formats across different data sources, CONNX can find the date in question, no matter how it is formatted.
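A date-normalization step of that kind can be sketched in Python as below; the list of accepted formats is an assumption, and a real deployment would drive it from source metadata:
```python
# Hedged sketch only: normalizing a trade date that arrives in different
# formats from different source systems. The format list is illustrative,
# and its order decides how ambiguous dates like 07/04/2023 are read.
from datetime import datetime

KNOWN_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%Y%m%d", "%d-%b-%Y"]

def normalize_trade_date(raw):
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {raw!r}")

print(normalize_trade_date("07/04/2023"))   # '2023-04-07' (day-first rule)
print(normalize_trade_date("20230704"))     # '2023-07-04'
print(normalize_trade_date("04-Jul-2023"))  # '2023-07-04'
```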
- Data-driven decisions boost customer satisfaction and growth
For the bank and its customers, the virtual data warehouse has represented a total turnaround. With a complete overview of a customer’s portfolio at their fingertips in real time, managers earned soaring customer trust. And as word spread, the rising tide lifted the bank and the wealth it managed for individuals, institutions and non-profit charities alike.
Fresh data, smiling customers, and powerful, direct dashboards for wealth managers mean CONNX has helped take the bank’s reputation to the next level. With the ability to perform live pattern analysis and forecasting, act consistently on data-driven decisions and boost industry-leading security, the bank is poised for success in the years to come.