Terminology & Acronyms
We use the following terms throughout this document.
Advanced analytics
The autonomous or semi-autonomous examination of data or content using sophisticated techniques and tools to discover deeper insights, make predictions, or generate recommendations. Advanced analytic techniques include data/text mining, machine learning, pattern matching, forecasting, visualisation, semantic analysis, sentiment analysis, network and cluster analysis, multivariate statistics, graph analysis, simulation, complex event processing and neural networks.
Aggregate data
Data, typically numerical, that is collected from multiple sources and/or on multiple measures or variables, and combined into data summaries, typically for the purposes of public reporting or statistical analysis.
Artificial intelligence
The theory and development of computer systems able to perform tasks that traditionally have required human intelligence.
Data attribute
A data attribute is a single field representing a certain feature, characteristic, or dimension of a data object.
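The distinction between a data object and its attributes can be sketched in code. This is an illustrative example only; the field names (trade_id, notional, currency, trade_date) are invented and not drawn from this document.

```python
# Hypothetical illustration: each field of this record is a single data
# attribute describing one feature of a "trade" data object.
from dataclasses import dataclass
from datetime import date

@dataclass
class Trade:
    trade_id: str        # attribute: unique identifier
    notional: float      # attribute: numerical measure
    currency: str        # attribute: categorical dimension
    trade_date: date     # attribute: temporal characteristic

t = Trade("T-001", 1_000_000.0, "GBP", date(2021, 3, 1))
print(t.currency)  # accessing one attribute of the data object
```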
Data model
A semantic data model is a method of organising data that reflects the basic meaning of data items and the relationships among them. This organisation makes it easier to develop application programs and to maintain the consistency of data when it is updated.
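A minimal sketch of organising data by meaning and relationship, with invented entities (a counterparty and a trade) used purely for illustration:

```python
# Data organised around the meaning of items (a counterparty, a trade)
# and the relationship between them (a trade is with a counterparty),
# rather than around physical storage layout. All values are invented.
counterparties = {"CP-1": {"name": "Example Bank"}}
trades = {"T-001": {"notional": 500_000, "counterparty_id": "CP-1"}}

# The modelled relationship lets an application navigate by meaning:
trade = trades["T-001"]
owner = counterparties[trade["counterparty_id"]]
print(owner["name"])
```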
Data standards
Data standards are the rules by which data are described and recorded in a consistent way. In order to share, exchange, and understand data, the format and meaning must be standardised.
Exposure
In finance, exposure refers to the amount of money an investor has invested in a particular asset; it also represents the amount the investor could potentially lose on that investment.
Granular data
The data points and data formats which firms use in their internal books and records for financial and business purposes. Data are defined at the lowest appropriate level possible for a given data set, for example relating to individual contracts or transactions.
Machine learning
A method of designing a sequence of actions to solve a problem that optimises automatically through experience, with limited or no human intervention.
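The idea of optimising automatically through experience can be shown with a deliberately tiny sketch. The program is never told the rule y = 2x; it improves its parameter estimate from example data alone. The data values and learning rate are invented for illustration.

```python
# A minimal machine learning sketch: gradient descent learns the slope
# of y = 2x from examples (experience), with no hand-coded rule and no
# human intervention in the learning loop.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, output) examples

w = 0.0          # model parameter, initially a guess
lr = 0.05        # learning rate
for _ in range(200):
    for x, y in data:
        error = w * x - y
        w -= lr * error * x   # adjust the parameter to reduce the error

print(round(w, 2))  # converges towards 2.0, learned from the data alone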
Open source
Open source describes software that comes with permission to use, copy and distribute, either as is or with modifications, and that may be offered either free of charge or for a fee. The source code must be made available.
Predictive analytics
The use and examination of data to predict patterns of activity. Predictive analytics may involve technologies such as machine learning or visualisation tools and is characterised by techniques such as regression analysis, forecasting, multivariate statistics, pattern matching and predictive modelling.
Proof of Concept
A proof of concept (POC) is a demonstration of a product, service or solution in a sales context. A POC should demonstrate that the product or concept will fulfil customer requirements while also providing a compelling business case for adoption.
Reference implementation
In this POC, the reference implementation serves as a point of reference rather than being put directly to productive use.
Regulatory data
Data received to fulfil mandated functions such as those of a supervisor, regulator, macroprudential authority or resolution authority.
Reporting requirements
The description of which firms need to provide data, what data they need to provide, how they need to provide it and when they need to provide it. Requirements may include rules, instructions and technical specifications.
Structured data
Information that has a pre-defined data model or is organised in a pre-defined manner.
Suptech
Any application of financial technology used by regulatory, supervisory and oversight authorities.
Technology stack
A list of all the technology services used to build and run a single application.
Unstructured data
Information that either does not have a pre-defined data model or is not organised in a pre-defined manner.
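The contrast between structured and unstructured data can be illustrated with a short sketch. The sample record and sentence are invented for illustration only.

```python
# Structured data conforms to a pre-defined model (fixed fields, types),
# so a program can read any field directly.
import json

structured = {"trade_id": "T-001", "notional": 1_000_000, "currency": "GBP"}
print(json.dumps(structured))

# Unstructured data is free text with no pre-defined model; extracting the
# same facts requires analysis (e.g. text mining or semantic analysis).
unstructured = "The firm booked a GBP 1m trade, reference T-001, yesterday."
print("GBP" in unstructured)  # prints True
```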
BISIH: Bank for International Settlements Innovation Hub
BoE: Bank of England
CDM: Common Domain Model
FpML: Financial Products Markup Language
ISDA: International Swaps and Derivatives Association
MAS: Monetary Authority of Singapore
POC: Proof of Concept