Recent Discussions
What is Entity Quality?
Most organizations are concerned with data quality, but entity quality is usually neglected. So what exactly is Entity Quality? Entity Quality, or EQ for short, is a way to measure the quality of a resolved entity. Say we resolve an entity by matching three different records using some matching logic. The question that arises is: do these three records actually represent the same real-world thing, and how confident are we that they do? That is what Entity Quality is about. It gives end users confidence via a calculated, aggregated score (a minimal sketch of what such a score could look like appears at the end of this post). At Quantexa, we provide Entity Quality Scoring (EQS), which produces confidence scores for resolved entities and detects over- and under-linking. Check out a nice short blog on the entity quality overlinking tool that was developed on top of this functionality. This functionality is embedded in the product, so users can easily view the generated scores and investigate any entity if and when needed. Do you measure entity quality at your organization? If yes, how? If not, do you think this could be useful? Please share your thoughts!
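To make the idea of an aggregated confidence score concrete, here is a minimal, hypothetical sketch. It is not Quantexa's EQS algorithm; it simply treats a resolved entity as a set of pairwise record-match scores, aggregates them, and flags possible over-linking when any pair falls below an arbitrary threshold. All field names and thresholds are invented.

```python
# Hypothetical illustration only: EQS itself is more sophisticated.
from itertools import combinations

def pairwise_similarity(a: dict, b: dict) -> float:
    """Fraction of shared attributes that agree between two records."""
    shared = [k for k in a if k in b and a[k] and b[k]]
    if not shared:
        return 0.0
    return sum(a[k] == b[k] for k in shared) / len(shared)

def entity_quality(records: list[dict], overlink_threshold: float = 0.5) -> dict:
    """Aggregate pairwise match scores into one entity-level confidence score."""
    pairs = list(combinations(records, 2))
    scores = [pairwise_similarity(a, b) for a, b in pairs]
    return {
        "score": min(scores) if scores else 1.0,  # weakest pair drives confidence
        "possible_overlink": any(s < overlink_threshold for s in scores),
    }

records = [
    {"name": "J. Smith", "dob": "1980-01-01", "phone": "555-0100"},
    {"name": "J. Smith", "dob": "1980-01-01", "phone": "555-0100"},
    {"name": "John Smyth", "dob": "1979-12-31", "phone": "555-0199"},
]
print(entity_quality(records))  # low score -> review the entity for over-linking
```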
Entity Quality Management for Data Quality?
Data quality is one of the major challenges organizations suffer from, consuming up to 50% of data users' time, costing up to a 30% loss in revenue, and destroying trust in data. In today's data-dependent business environment, the urgency for impeccable data quality is driven by a confluence of market forces: fears of reputational damage due to data mishaps, competitive pressure from digital-first rivals, and an array of stringent regulations are pushing companies to reassess their data management strategies. Quantexa's Entity Quality Management (EQM) solution enhances the efficiency and effectiveness of data remediation by tailoring campaigns to align with evolving business needs. It directs remediation activities through the most efficient channels, from hands-on steward workflows to automated prompts in customer portals and CRM systems. Validated in production environments, Quantexa's EQM solution has integrated more than 80 data sources, achieved a 9% rate of deduplication, and uncovered 2.5 million matches that competing solutions overlooked. At the same time, it has identified cases of overmatching and supported the prioritization of 1,000 merge reviews per week.
How does it work? Quantexa's EQM solution ingests a wide array of data, irrespective of its origin, embracing both internal and external sources with a model-agnostic approach that ensures rapid integration. It employs AI/ML models to automatically parse, cleanse, and standardize data, laying the groundwork for a consistent and reliable data foundation. Source records are then transformed into an accurate single entity view, thanks to Quantexa's scalable, highly accurate entity resolution capability. The solution assesses each record against a set of rules designed to identify and rectify anomalies, such as historical over-linking, under-linking, potential duplicates, and various attribute inconsistencies, improving the data's integrity and reducing false positives (a simplified sketch of this kind of rule check appears at the end of this post). The user experience is elevated through interactive dashboards with dynamic visualizations that break data down by quality indicators, technical specifications, and business attributes, enabling a thorough understanding of data quality and focused identification of areas for remediation. By leveraging Quantexa's EQM solution, organizations gain situational awareness of data quality across the enterprise, enabling a focused approach to remediation. As a result, businesses can prioritize their data quality efforts efficiently, targeting the areas of greatest impact and aligning remediation tasks with current business priorities and operational demands. Integration with key enterprise services facilitates a seamless and effective data quality management process.
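As an illustration of the rule-based assessment described above, here is a minimal, hypothetical sketch. The rule names, thresholds, and field names are invented for illustration and are not the actual EQM rule set.

```python
# Invented rules for illustration; production rule sets are far richer.
def assess_entity(entity: dict) -> list[str]:
    """Return a list of data-quality flags for one resolved entity."""
    records = entity["records"]
    flags = []

    # Possible over-linking: suspiciously many records collapsed into one entity.
    if len(records) > 20:
        flags.append("possible_overlink")

    # Attribute inconsistency: conflicting dates of birth across records.
    dobs = {r.get("dob") for r in records if r.get("dob")}
    if len(dobs) > 1:
        flags.append("inconsistent_dob")

    # Potential duplicates: identical records ingested from the same source.
    seen = set()
    for r in records:
        key = (r.get("source"), r.get("name"), r.get("dob"))
        if key in seen:
            flags.append("potential_duplicate")
            break
        seen.add(key)

    return flags

entity = {"records": [
    {"source": "crm", "name": "Acme Ltd", "dob": None},
    {"source": "crm", "name": "Acme Ltd", "dob": None},
]}
print(assess_entity(entity))  # ['potential_duplicate']
```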
Exploring the Challenges of Achieving a Single View in Datawarehousing
When it comes to achieving a single view of individuals or businesses in data warehousing, several key insights emerge:
1️⃣ Data Integration: Integration is a critical aspect. Organizations often struggle to merge data from disparate sources such as customer databases, transaction systems, and marketing platforms. Ensuring seamless data integration is essential for a unified view.
2️⃣ Data Quality: Data quality plays a vital role in establishing a reliable single view. Inaccurate, incomplete, or inconsistent data can hinder decision-making and analysis. Implementing data cleansing processes and validation mechanisms is a crucial step towards maintaining high-quality data (a minimal example is sketched at the end of this post).
3️⃣ Data Silos: Data silos, where information is isolated within different systems or departments, pose a significant challenge. Overcoming them requires breaking down barriers, implementing data governance practices, and establishing data-sharing mechanisms.
4️⃣ Business Context: Contextual understanding is crucial for creating a comprehensive view. Data needs to be interpreted within the specific business context to derive meaningful insights. Adapting to evolving business requirements and aligning data consolidation efforts accordingly is vital.
Questions I always have and continue to ask:
🔸 How do you address the complexities of data integration when combining data from diverse sources?
🔸 What approaches have you found effective in ensuring data quality throughout the process?
🔸 Have you encountered challenges in breaking down data silos? How did you overcome them?
🔸 How do you incorporate the business context into your data consolidation efforts?
🔸 Are there any specific tools or technologies you recommend for achieving a single view in data warehousing?
Please share your thoughts and let's learn from one another!
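On the data quality point (2️⃣) above, here is a minimal, hypothetical validation sketch: it checks incoming records against a few simple rules (required fields, a basic format check) before they are merged into a single view. The field names and rules are assumptions, not a recommendation for any particular tool.

```python
# Illustrative validation rules only; real pipelines need much more than this.
import re

REQUIRED_FIELDS = ["customer_id", "name", "email"]
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in one source record."""
    issues = [f"missing:{f}" for f in REQUIRED_FIELDS if not record.get(f)]
    email = record.get("email")
    if email and not EMAIL_RE.match(email):
        issues.append("invalid:email")
    return issues

batch = [
    {"customer_id": "C1", "name": "Acme Ltd", "email": "info@acme.com"},
    {"customer_id": "C2", "name": "", "email": "not-an-email"},
]
for rec in batch:
    print(rec["customer_id"], validate_record(rec))
# C1 []
# C2 ['missing:name', 'invalid:email']
```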
📣 Master Data Management UI Demo & Feedback Sessions
In the coming weeks, we'll be running demo & feedback sessions on our new Master Data Management UI. You'll get to see an exclusive demo of our new UI and provide your feedback and insights in a small focus group setting. Available sessions (login required):
Session 1: Thursday, June 13th (3:30 PM - 4:15 PM BST) https://community.quantexa.com/events/157
Session 2: Wednesday, June 19th (9:15 AM - 10:00 AM BST) https://community.quantexa.com/events/158-exclusive-master-data-management-ui-demo-feedback-session-2
Session 3: Tuesday, June 25th (3:00 PM - 3:45 PM BST) https://community.quantexa.com/events/159
Don't forget to subscribe to the User Research Panel page to stay up to date with other research opportunities! Please note these sessions are open to our customers & partners only.
Aug 3 Webinar: The Power of “Contextually Connected Entities” for Maximising Decision Intelligence
As defined by Gartner, Decision Intelligence improves decision-making by understanding and engineering how decisions are made, and how outcomes are evaluated, managed, and improved through feedback. Using artificial intelligence, Decision Intelligence unlocks the power of entity resolution and graph analytics to turn data into accurate decisions at scale. This means that data becomes more than the sum of its parts: not piecemeal, siloed, and afflicted with poor quality. Decision Intelligence connects data from previously siloed and scattered points and creates a single, trusted, and reusable resource. The business world is more data-driven than ever, but making sense of the vast amounts of data available can be a daunting task. Many businesses struggle to bridge the gap between data and decision-making, and often lack a comprehensive view of their customers. Join me and Melissa Ferrer, Partner, Data & AI at Deloitte, in a live webinar as we delve deeper into Decision Intelligence and the importance of Contextually Connected Entities in making informed decisions.
Event page on the community: Webinar: The Power of “Contextually Connected Entities” for Maximising Decision Intelligence
Or go directly to the event landing page: https://info.quantexa.com/decision-intelligence-webinar-apac?hs_preview=Novfdgge-122643133495
To use a Graph DB or not?
Graph databases (Graph DBs) are gaining popularity as an alternative to traditional relational databases because of their ability to manage highly interconnected data. However, the decision of whether or not to use a Graph DB depends on several factors and is not straightforward. Some users of Graph DB technology have expressed frustration at the lack of tangible business outcomes after investing in projects for two years or more, and are now questioning whether to keep using the technology and paying for licenses and maintenance. The most commonly cited complaints were performance issues, complexity, and flexibility. It is essential to note that Graph DB technology is not a "magic wand": it requires significant pre-work to create meaningful connections between the data (a small sketch of that pre-work appears at the end of this post). Despite its advantages in managing complex data relationships, a Graph DB is, after all, a data store, and it has its shortcomings. If you have experience using Graph DBs, I would love to hear about it and what use cases you've used them for. Cheers
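To illustrate the pre-work mentioned above, here is a minimal, hypothetical sketch: before any graph database can answer connection queries, flat records have to be turned into nodes and edges, for example by linking customer records that share a phone number or address. The field names and linking rules are assumptions.

```python
# Hypothetical pre-processing: derive edges from flat records before loading
# them into any graph store. Real projects also need cleansing and resolution.
from collections import defaultdict

customers = [
    {"id": "C1", "name": "Acme Ltd", "phone": "555-0100", "address": "1 High St"},
    {"id": "C2", "name": "Acme Limited", "phone": "555-0100", "address": "1 High St"},
    {"id": "C3", "name": "Widget Co", "phone": "555-0222", "address": "9 Low Rd"},
]

def build_edges(records, link_keys=("phone", "address")):
    """Create an edge between any two records that share a linking attribute."""
    by_value = defaultdict(list)
    for rec in records:
        for key in link_keys:
            if rec.get(key):
                by_value[(key, rec[key])].append(rec["id"])
    edges = set()
    for (key, _), ids in by_value.items():
        for a in ids:
            for b in ids:
                if a < b:
                    edges.add((a, b, key))
    return sorted(edges)

print(build_edges(customers))
# [('C1', 'C2', 'address'), ('C1', 'C2', 'phone')]
```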
Recording for The Power of “Contextually Connected Entities” for Maximising Decision Intelligence
Hi everyone! If you missed the webinar on The Power of “Contextually Connected Entities” for Maximising Decision Intelligence, you can now watch it on demand at the link below!
https://info.quantexa.com/decision-intelligence-webinar-apac-deloitte
We're keen to hear your thoughts on the topic, so please leave them below! Cheers
The Inherent Problems of MDM
The aim of having one master record for each real-world entity has always been hard to achieve—and it's becoming even harder. Companies are contending with:
Multiple internal applications—many of which will contain different versions of the same master data record
Numerous external data sources that provide additional—sometimes contradicting—information about companies and individuals
Bringing together all these views of master data is incredibly difficult, because of:
The data quality problem
Traditional MDM has an inherent data quality problem. And it affects your decision making, regulatory compliance, business effectiveness and efficiency. Traditional MDM solutions don't focus on solving data quality issues. In fact, they tend to fail at the first hurdle of matching data—because they struggle to join data from across disparate data sources such as multiple internal applications. When you add the volume and variety of data from external sources, it's even further beyond their capability. Instead, they use roughly the same matching algorithms they've used for the past 20+ years, which rely on record-to-record comparison that is very fragile when key attributes are missing or different. And that's a problem. Because crucial information goes unreported when your MDM solution can't catch essential links between data, and obscure relationships and connections are often overlooked. It also makes your MDM implementation very high risk. Bad data quality means:
Data remains trapped in silos and is duplicated across channels
Master records aren't accurate and true-to-life
Bad decisions are made, because they're based on incorrect or delayed data
Key information and critical links go unnoticed
Opportunities are missed
The transformation challenge
MDM is both a business and a technology transformation challenge. You start out with multiple users updating master records in separate applications, all with their own ways of working. Inevitably, it's messy, it's haphazard, and it results in duplicate records, inconsistencies and confusion. So, to combat this, you decide to standardize things. Everyone is to use just one application, with a standardized set of rules on how to input data, how data should be formatted, what records look like and more. It's a good idea—in theory. But the problem is that the real world rarely plays out quite so neatly. So what you end up with is this:
A logistical nightmare, as you try to migrate decades of data and make it conform to your ideal record format.
Confused and frustrated users who need to be transitioned to the new service—with all the training and business change support that entails.
Backwards-compatibility issues, as information moves to new places and takes on different—unrecognizable—formats, meaning users and business applications can no longer find information.
Challenges updating future records. If your records only track A, B and C, what happens when users later need to add D and E? Is it updated across the entire data store? And what happens to data that is initially discarded for non-conformity, but is later needed?
The varying needs of different data consumers
For an MDM initiative to be considered successful, it needs to be able to serve data to consumers across the organization.
For instance:
Analytics teams who need to link data sources for decision intelligence
Fraud monitoring applications
Customer services or relationship managers who need rich views of their customers
Finance teams who need to aggregate risk reporting
Different business units often have different views on the data they require. So, when an MDM initiative attempts to standardize master data attributes across the organization, it may mean dropping attributes that these data consumers rely on—which, naturally, can result in tension and impact business performance. MDM needs to be able to present a rich and deep view of data to the areas of the organization that consume it, while on the journey to standardizing key attributes. But that often runs counter to the way a lot of existing MDM software works, which relies on a fixed view of data that needs to be adhered to from day one.
The need for governance and control
In large organizations there are often many applications that hold customer data—each controlled by a different business unit. And that means a wide range of stakeholders with different priorities. Implementing traditional MDM often requires each business unit to give up control of their applications and data to a central initiative—which can result in a great deal of pushback and agitation. To successfully implement an MDM initiative, you'll need to be ready to address the political challenges that come with it. A lot of that relies on bringing people together around a vision of a service that will benefit them, within a transformation program that can actually deliver.
Building a Single, Golden Point of Truth
The key to resolving the traditional MDM data quality issue lies in powerful entity resolution that retains context and doesn't force data into a standardized format. We call this contextual MDM.
How contextual MDM stands apart
Originally built to tackle financial crime, contextual MDM (cMDM) ingests data from both internal and external sources to build an accurate, connected and enriched single-entity view using entity resolution and network generation technology. This is different from traditional MDM solutions, which rely on record-to-record matching—a method that does not work well on disparate records, as it relies on many attributes matching. At Quantexa, we use an expanded range of data—including address, phone, email, country, and third-party data—to make further connections and enrich your data. That is why our solution can make connections between records even when data quality is poor.
Traditional MDM
Traditional matching does not work well on sparsely populated records, because it relies on many attributes matching. Records can only be accurately linked if a number of fields match (for example, if two records have the same name, date of birth, and address on file). The lack of additional contextual data makes deduplication difficult and leaves questions unanswered. Traditional MDM also relies on you to set rules. If the rules are too rigid, records will be under-linked (meaning duplication is more likely to go uncaught). If the rules are too loose, records will be over-linked (meaning different records are more likely to be mistakenly deduplicated, even when the entities in question are different).
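To make the under- and over-linking trade-off concrete, here is a minimal, hypothetical sketch of two record-to-record rules (not any vendor's actual matching logic): a rigid rule that requires name, date of birth, and address to match exactly, and a loose rule that links on surname alone.

```python
# Deliberately simplistic rules, to illustrate under- and over-linking only.
def rigid_match(a: dict, b: dict) -> bool:
    """Link only if name, dob and address all match exactly."""
    return all(a.get(k) and a.get(k) == b.get(k) for k in ("name", "dob", "address"))

def loose_match(a: dict, b: dict) -> bool:
    """Link whenever surnames match, which is far too permissive."""
    return a["name"].split()[-1].lower() == b["name"].split()[-1].lower()

r1 = {"name": "Jane Smith", "dob": "1985-03-02", "address": "1 High St"}
r2 = {"name": "JANE SMITH", "dob": "1985-03-02", "address": "1 High Street"}  # same person
r3 = {"name": "John Smith", "dob": "1990-07-14", "address": "9 Low Rd"}       # different person

print(rigid_match(r1, r2))  # False: under-linking, the same person is missed
print(loose_match(r1, r3))  # True: over-linking, two different people are merged
```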
Quantexa cMDM
With our entity resolution software, connections can be made intelligently across records. Using additional fields and an expanded range of records from any number of internal and/or external sources, our software makes it possible to accurately determine when multiple records exist for a single entity—and to turn duplicated records into a single, enriched entity view (a minimal sketch of this idea follows at the end of this post). Resolved entities can also be seen in context with their networks—so you can see how different entities relate to each other. With cMDM, data across different records is iteratively updated to enrich all sources, leading to better match rates and higher-quality record data.
The Benefits of Contextual Master Data Management
With contextual MDM, you gain:
A single, complete view of connected data
A foundation for trusted data
Flexible and open architecture
Low-risk implementation
The power to make better decisions
Consumer-oriented views of master data
So you can:
Create and update accurate master records, in real time
Spot hidden risks and identify high-value growth opportunities
Share essential data among your teams
Offer frictionless digital-first experiences for your customers
Develop good data practices and upkeep organizational data hygiene
Scale your business easily
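As promised above, here is a rough, simplified stand-in for entity resolution (not Quantexa's engine): it links records transitively whenever they share a strong identifier such as an email address or phone number, then merges each linked group into a single enriched view. Identifiers and fields are illustrative assumptions.

```python
# Simplified stand-in for entity resolution: union-find over shared identifiers.
from collections import defaultdict

def resolve(records, keys=("email", "phone")):
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Link any two records that share a value on a strong identifier.
    by_value = defaultdict(list)
    for idx, rec in enumerate(records):
        for key in keys:
            if rec.get(key):
                by_value[(key, rec[key])].append(idx)
    for ids in by_value.values():
        for other in ids[1:]:
            union(ids[0], other)

    # Merge each linked group into one enriched entity view.
    groups = defaultdict(list)
    for idx in range(len(records)):
        groups[find(idx)].append(records[idx])
    entities = []
    for group in groups.values():
        merged = {}
        for rec in group:
            merged.update({k: v for k, v in rec.items() if v})
        entities.append(merged)
    return entities

records = [
    {"name": "Acme Ltd", "email": "info@acme.com", "phone": None},
    {"name": "ACME Limited", "email": "info@acme.com", "phone": "555-0100"},
    {"name": "Widget Co", "email": "hi@widget.io", "phone": "555-0222"},
]
print(len(resolve(records)))  # 2 entities: the two Acme records are merged
```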
What are the limitations of standard MDM?
I was discussing this with a colleague of mine earlier today, and I came up with the following list:
Increased interest in "transactional" style MDM rather than traditional "registry" style MDM
Adoption of specialized units/teams at organizations to look after the "single view", often under different names, such as One Customer View
Probably globally, but within Australia in particular, privacy and compliance are heavily emphasized for such programs (i.e. MDM programs)
Utilizing AI/ML for automation, but also to increase accuracy
Although AI/ML is overhyped and everyone wants to use it, caution is increasing, and organizations are looking for ways to build fully transparent and explainable models
What are your thoughts on these, and what do you currently experience as noteworthy trends in this area?
Data Quality is STILL a fundamentally unsolved issue!
Having met a couple of customers in the banking sector this week in South East Asia, a real pain point that was constantly shared was data quality: incomplete information, inconsistency, lack of standardized entries, and so on. Issues that we all probably know of. Not that this is a surprise of any sort, but what puzzles me is that traditional approaches are still being used to tackle this ever-present and growing issue. In an ever-evolving world where we are witnessing advancements in AI and other areas at an unprecedented pace, organizations should not still be struggling with such a foundational challenge. It goes without saying how important a proper data foundation is; anything less leads to improper analytics, entity resolution, decisioning, etc. I'll probably need to (and will) write a blog about this in the coming weeks. I'm keen to hear from anyone about their approach/vision for tackling data quality issues, and what are some of your most pronounced challenges? Cheers