MASTER DATA MANAGEMENT AND DATA GOVERNANCE, 2/E (Database & ERP - OMG)



The ISO 20022 method starts with the modeling of a particular business process, identifying the relationships and interactions between the actors and the information they must share to execute the process. The output is subsequently organized into a formal business model using the Unified Modeling Language (UML).

ISO 20022 describes a Metadata Repository containing descriptions of messages and business processes, along with a maintenance process for the repository content. The Repository contains a large amount of financial services metadata that has been shared and standardized across the industry. FIX, for its part, is reported to be the standard of the securities front office, carrying messages relating to indications of interest, trade instructions, executions, etc. FIX supports equities, fixed income, options, futures, and FX. FIXML was later optimized to greatly reduce message size to meet the requirements of listed derivatives clearing.
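As an illustration of the FIX wire format, the sketch below parses a tag=value message using the standard SOH delimiter. The message content is invented for illustration, though the tags shown (8=BeginString, 35=MsgType, 55=Symbol) are standard FIX fields.

```python
# Minimal sketch: parse a FIX tag=value message into a dict.
# Fields are separated by the SOH (\x01) control character.
SOH = "\x01"

def parse_fix(raw: str) -> dict:
    """Split a FIX message into {tag: value} pairs."""
    fields = {}
    for pair in raw.strip(SOH).split(SOH):
        tag, _, value = pair.partition("=")
        fields[tag] = value
    return fields

# A simplified new-order message: 35=D (MsgType), 55 (Symbol), 38 (OrderQty)
msg = SOH.join(["8=FIX.4.4", "35=D", "55=INFY", "54=1", "38=100"]) + SOH
parsed = parse_fix(msg)
print(parsed["35"], parsed["55"], parsed["38"])  # -> D INFY 100
```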

As business information is made machine readable through the XBRL tagging process, it does not need to be entered manually again and can be transmitted and processed directly by computers and software applications.
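A minimal sketch of how tagged facts become machine readable, using only the Python standard library; the element names, namespace and values below are invented for illustration and do not come from a real XBRL taxonomy.

```python
# Read tagged facts from a simplified XBRL-style instance document.
import xml.etree.ElementTree as ET

instance = """
<xbrl xmlns:in="http://example.com/in-gaap">
  <in:Deposits contextRef="FY" unitRef="INR" decimals="0">5000000</in:Deposits>
  <in:Advances contextRef="FY" unitRef="INR" decimals="0">3200000</in:Advances>
</xbrl>
"""

root = ET.fromstring(instance)
for fact in root:
    # The tag identifies the business concept; the value needs no re-keying.
    concept = fact.tag.split("}")[-1]
    print(concept, fact.get("unitRef"), fact.text)
```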


The use of XBRL has also expanded into the financial transaction processing area in recent years. FpML (Financial products Markup Language) likewise uses XML syntax and was specifically developed to describe the often complicated contracts that form the basis of financial derivative products. It is widely used between broker-dealers and other securities industry players to exchange information on swaps, CDOs, and similar instruments.
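The sketch below builds a skeletal FpML-style trade fragment; the element names follow the general shape of FpML (trade, swap, notionalAmount) but the fragment is simplified and not schema-valid.

```python
# Build a simplified FpML-style interest rate swap fragment.
import xml.etree.ElementTree as ET

trade = ET.Element("trade")
swap = ET.SubElement(trade, "swap")
stream = ET.SubElement(swap, "swapStream")
notional = ET.SubElement(stream, "notionalAmount")
ET.SubElement(notional, "currency").text = "USD"
ET.SubElement(notional, "amount").text = "10000000"

print(ET.tostring(trade, encoding="unicode"))
```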


Each of the elements in the corporate actions taxonomy corresponds to a message element in the ISO 20022 Corporate Actions Notification message. This initiative attempts to bring together the XBRL standard with the standard used by financial intermediaries to announce and process corporate action events. These types of messages are exchanged daily between issuing and acquiring banks. The financial crisis brought to the fore the fact that this is one of the most neglected areas in the financial services industry.

In the investment area, reference data defines the financial instruments that are used in the financial markets.


It also provides support for the maintenance and distribution of reference data using FIX messages. As a result of the integration of ISO 15022 into ISO 20022, the financial messaging standard was expanded into a business model of reference data for the financial markets, in addition to the messages themselves. Thus, XBRL may become the reporting standard for all regulatory reporting of structured data.

Chapter IV — Issues in data management and data quality in banks

With the advent of technology in banking, huge volumes of data are produced and stored digitally. The ability of organizations to capture, manage, preserve and deliver the right information at the right time to the right personnel is one of the key success factors of information management. This chapter highlights various issues relating to data quality and the challenges faced by banks with regard to data management and data quality. IEB was aimed at providing an overview of the Information Management landscape across small and medium private sector banks. Banks tend to collect information across multiple locations and in multiple formats, potentially creating non-standardized data.

Standardization across the business eliminates data duplication, data redundancy and the cost associated with resolving these issues. All surveyed banks have standardized reporting formats at the local branch, region, zone and corporate levels. Even so, banks faced challenges in standardizing the report generation methodology despite having a standardized reporting format and clearly defined outputs.

Banks use different technologies and databases to capture and store information. A key observation was that standardization is needed in the Extract, Transform and Load (ETL) process, which would result in process efficiency, reduced manual intervention and reduced costs. All these factors could potentially impact business aspects like capital management and capital ratios, asset quality monitoring, and funds and liquidity management, ultimately impacting effective risk management.
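A minimal sketch of a shared, standardized transform step, assuming pandas is available; the column names, branch-code mapping and date format are illustrative assumptions, not prescriptions from the report.

```python
# Every branch feed passes through one common transform, so the loaded
# data uses the same codes, types and date conventions.
import pandas as pd

BRANCH_CODE_MAP = {"BLR-01": "0101", "MUM-HQ": "0201"}  # hypothetical codes

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    out = raw.copy()
    out["branch_code"] = out["branch_code"].map(BRANCH_CODE_MAP)
    out["balance_inr"] = pd.to_numeric(
        out["balance_inr"].astype(str).str.replace(",", ""), errors="coerce")
    out["as_of"] = pd.to_datetime(out["as_of"], format="%d-%m-%Y")
    return out

raw = pd.DataFrame({"branch_code": ["BLR-01"],
                    "balance_inr": ["1,000"],      # non-standard input
                    "as_of": ["31-03-2019"]})
print(transform(raw))
```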

All these aspects indicate the need for enhanced processes and procedures for data and information management in banks, and the need for robust and standardized metadata. The Committee identified various data-related aspects, specific to the credit area, that would need to be addressed to facilitate standardization across the banking system. These are detailed in Annex III.

Apart from the basic approaches for handling major risk categories, Basel II further entails progressive advancement to sophisticated but complex risk measurement and management approaches for credit, market and operational risks, depending on the size, sophistication and complexity of the respective banks. The Basel III framework has various direct and related components, such as increasing the quality and quantity of capital, enhancing the liquidity risk management framework, the leverage ratio, incentives for banks to clear standardised OTC derivatives contracts through qualified central counterparties, regulatory prescriptions for Domestic Systemically Important Banks, and the Countercyclical Capital Buffer (CCCB) framework.

The Framework lays down guidelines for early recognition of financial distress, information sharing among lenders and coordinated steps for prompt resolution and fair recovery for lenders. The challenge for banks will be to develop new products and delivery channels that meet the evolving needs and expectations of their customers.

Thus, there is a need for effective information management practices and robust MIS. This calls for a robust data governance framework in banks.

Chapter V — Data governance architecture in banks

The issues highlighted in the previous chapter call for a robust data governance, or information management, architecture in banks. Specific focus needs to be accorded to data quality, both through various mission-mode strategies and as an ongoing exercise.

This chapter delineates the recommendations of the Committee on the key components of a robust data governance architecture in banks and key practices to address data quality issues.


Bob Seiner states in his book Non-Invasive Data Governance that data governance is the formal execution and enforcement of authority over the management of data and data-related assets. In this sense, data governance refers to administering, or formalizing discipline around, the management of data. According to the Data Governance Institute, a well-defined and complete data governance solution is a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models that describe who can take what actions with what information, when, under what circumstances, and using what methods.

There are other views or definitions of data governance. Broadly, data governance refers to the policies, standards, guidelines, business rules and organizational structures governing the data-related processes performed in an organization. In the aftermath of the financial crisis, the Senior Supervisors Group indicated key prerequisites for implementing a comprehensive risk data infrastructure. The Committee recommends that the key components of a data governance architecture in banks incorporate a focus on the following:

Ideally, data ownership needs to be based primarily on the business function. Data Profiling is a systematic exercise to gather actionable and measurable information about the quality of data. Typical data profiling statistics include parameters such as the number of nulls, outliers, the number of data items violating the data type, the number of distinct values stored, and distribution patterns for the data.
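A minimal profiling sketch, assuming pandas, that computes the statistics named above for a single column; the sample data and the IQR outlier rule are illustrative choices, not prescriptions from the report.

```python
# Profile one column: nulls, distinct values, type violations, outliers.
import pandas as pd

def profile(series: pd.Series) -> dict:
    stats = {"nulls": int(series.isna().sum()),
             "distinct": int(series.nunique())}
    numeric = pd.to_numeric(series, errors="coerce")
    # Values that fail numeric coercion (beyond genuine nulls) violate the type.
    stats["type_violations"] = int(numeric.isna().sum() - series.isna().sum())
    # Flag outliers with the usual 1.5 * IQR fence.
    q1, q3 = numeric.quantile([0.25, 0.75])
    iqr = q3 - q1
    stats["outliers"] = int(((numeric < q1 - 1.5 * iqr) |
                             (numeric > q3 + 1.5 * iqr)).sum())
    return stats

col = pd.Series(["100", "105", None, "abc", "98", "5000"])
print(profile(col))  # one null, one type violation, one outlier (5000)
```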

Information gathered from data profiling determines the overall health of the data and indicates the data elements that require immediate attention. A Data Cleansing process may be enabled for detecting and correcting erroneous data and data anomalies prior to loading data into the repository. Data cleansing may take place in real time using automated tools, or in batch as part of a periodic data cleansing initiative, and is done by applying pre-defined business rules and patterns to the data. A rigorous data monitoring procedure may be enabled at banks to handle the monitoring of data quality.
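A minimal rule-based cleansing sketch; the normalization steps and the PAN-format check are illustrative business rules of my own choosing, not rules specified in the report.

```python
# Apply pre-defined rules to one record, correcting what can be corrected
# and reporting what cannot.
import re

PAN_PATTERN = re.compile(r"^[A-Z]{5}[0-9]{4}[A-Z]$")  # Indian PAN format

def cleanse_record(record: dict) -> tuple:
    rec = dict(record)
    issues = []
    rec["customer_name"] = rec["customer_name"].strip().title()
    rec["pan"] = rec["pan"].strip().upper()            # correctable anomaly
    if not PAN_PATTERN.match(rec["pan"]):
        issues.append(f"invalid PAN: {rec['pan']!r}")  # needs human review
    return rec, issues

rec, issues = cleanse_record({"customer_name": "  rAO, s ",
                              "pan": "abcde1234f"})
print(rec, issues)  # name and PAN normalized; no remaining issues
```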

Based on the data monitoring reports, corrective actions are taken to cleanse the data: implement solutions that address the root causes of the data quality problems; monitor and verify the improvements that were implemented; and maintain improved results by standardizing, documenting and continuously monitoring successful improvements.
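A minimal monitoring sketch: each load recomputes quality metrics and compares them against agreed thresholds, so that breaches trigger corrective action; the metric names and threshold values are illustrative assumptions.

```python
# Compare this run's quality metrics against agreed thresholds.
THRESHOLDS = {"null_ratio": 0.02, "duplicate_ratio": 0.01}

def check_quality(metrics: dict) -> list:
    return [f"{name}={value:.3f} exceeds limit {THRESHOLDS[name]}"
            for name, value in metrics.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]

# One monitoring run over a hypothetical daily load:
for breach in check_quality({"null_ratio": 0.05, "duplicate_ratio": 0.004}):
    print("ALERT:", breach)  # feeds the corrective-action loop above
```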

This chapter delineates the efforts of earlier initiatives and the recommendations of various Committees, and provides major recommendations for enabling the standardization of data submitted to the RBI by commercial banks, NBFCs and UCBs.

The Reserve Bank of India (RBI) collects a large volume of data from banks for its key functions, such as monetary policy formulation, supervision, regulation and promoting research. This pool of banking data spans various dimensions and granularities, with different structures, formats, naming conventions, levels of aggregation and frequencies. However, at times these data lack internal consistency, which could potentially impact their utility as a policy input.

Further, similarities in data elements across multiple returns gave rise to problems of inconsistency. Therefore, a need was felt to develop a harmonised and integrated system of reporting banking data to the RBI by re-examining the entire gamut of definitions, classifications and coding structures of the data.

Further, the form of these data can be classified as either structured or unstructured. Various departments in the RBI have prescribed fixed-format returns for specific purposes. While each of the above returns has some distinct features, there are some common data elements among them. Several attempts were made earlier to rationalize return submission. In one such exercise, DSIM proposed that 76 of the returns then in existence be discontinued.

As a follow-up, 39 returns were finally discontinued.


In order to streamline the process, an initiative has been taken in the recent past to store data received through the XBRL platform in a centralized database that can be accessed by multiple users. To gauge the extent of data mismatch, the Group studied data from six such returns. Based on this study, the Group gave its observations on items requiring harmonization in respect of certain asset and liability items.

It also provided common definitions for key information blocks. Risk to financial stability from the sector emanates from the inter-linkages between NBFCs and other financial intermediaries, and from their funding dependencies. Accordingly, the regulatory guidelines are tuned towards discouraging a higher degree of leverage and requiring adequate capital buffers, so as to ensure that any stress on NBFC balance sheets is absorbed rather than transmitted to the financial system. From the standpoint of financial stability, this segment of NBFCs assumes importance given its linkages with the rest of the financial system.


Detailed instructions regarding the submission of returns by NBFCs have been issued through various company circulars; NBS-7A, for example, is a quarterly statement of capital funds, risk-weighted assets, risk asset ratio, etc. UCBs are primarily classified as scheduled or non-scheduled.

Following consolidation, the number of UCBs came down marginally from a year earlier. Hence, as part of data standardisation, data collected through other returns may also be brought into alignment with the common or standardised codes used in the BSR.


A suitable data model may be generated to facilitate an element-based, simplified and standardised data collection process by the RBI, under a generic model structure that is suitable for both primary and secondary data. A sub-group constituted by the earlier Mohanty Committee had identified a number of elements that should form the basis for rationalisation. In due course, after the rationalisation exercise, data-element-based return submission may also be initiated. This chapter reviews the current status of ADF implementation in commercial banks and provides recommendations in this regard.
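A minimal sketch of what an element-based observation might look like, in contrast to a fixed-format return; the field names and the sample element code are hypothetical, not taken from any RBI data dictionary.

```python
# Each reported observation is one self-describing data element, so the
# same element can feed many returns but is reported only once.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataElement:
    element_code: str       # standardized code from a common data dictionary
    reporting_entity: str
    as_of: date
    value: float
    unit: str = "INR"

obs = DataElement("DEP_SAV_OUTST", "BANK-042", date(2019, 3, 31), 1.25e9)
print(obs)
```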

Banks that have followed this vision in developing their central repositories will find it easier to migrate to the element-based data reporting paradigm. Banks that have followed a return-based approach would require some changes to their systems. In terms of the approach paper referred to above, the RGG consists of members from the IT Department, the Compliance Department and some of the key business groups that submit important regulatory returns.

In view of the guidelines set out in the Approach Paper, banks have set up processes to ensure compliance under ADF and have generally adopted a common approach. In this regard, banks may also explore using the platform for generating internal MIS and for other uses.




