🎥 Webinar: Data Understanding Best Practice
Quantexa's Head of Business Analysts, Senior Principal Data Engineer, and Principal Data Engineer present on delivery best practices for Data Understanding. Data is fundamental to the Quantexa Platform. To maximize its potential, it is crucial to have a comprehensive understanding of your data when designing and configuring your Quantexa deployment. Learn the best practices for achieving optimal data understanding and enhancing the effectiveness of your Quantexa solution.

Further Reading:
https://community.quantexa.com/kb/categories/113-data-source-onboarding
https://community.quantexa.com/kb/categories/110-data-source-onboarding-document-design
https://community.quantexa.com/kb/categories/111-data-source-onboarding-entity-design
https://community.quantexa.com/kb/categories/112-data-source-onboarding-data-quality
🚀 Join Us for Tomorrow's Webinar: Exploring AI's Role in Tackling Data Quality Challenges
In this webinar, Dan Onions, Global Head of Data Management at Quantexa, and Martin Maisey, Head of Data Management EMEA, will delve into the pressing question on every data professional's mind: "How can AI help me?" Unlock the full potential of your data strategy: as AI technologies, particularly LLMs, become increasingly integral to data management strategies, ensuring the quality and reliability of these systems' outputs is paramount. Our experts will explore the critical role of foundational data quality in harnessing AI effectively and responsibly, and address key challenges such as achieving consistency and accuracy in AI-generated outputs and aligning them with regulatory standards already on the horizon. Attendees will gain insights into practical, real-world applications of AI and learn how to make AI outputs on data trustworthy across the entire organization. Learn more and register: Webinar | The Biggest Challenges in Data Quality: How Far Can AI Go to Solve Them?
Entity Quality Management for Data Quality?
Data quality is one of the major challenges organizations suffer from, consuming up to 50% of data users' time, causing revenue losses of up to 30%, and destroying trust in data. In today's data-dependent business environment, the urgency for impeccable data quality is driven by a confluence of market forces. Fears of reputational damage due to data mishaps, competitive pressure from digital-first rivals, and an array of stringent regulations are pushing companies to reassess their data management strategies.

Quantexa's Entity Quality Management (EQM) solution enhances the efficiency and effectiveness of data remediation by tailoring campaigns to align with evolving business needs. It directs remediation activities through the most efficient remediation channels, from hands-on steward workflows to automated prompts in customer portals and CRM systems. Validated in production environments, Quantexa's EQM solution has integrated more than 80 data sources, achieved a 9% rate of deduplication, and uncovered 2.5 million matches that competing solutions overlooked. Simultaneously, it has identified cases of overmatching and supported the prioritization of 1,000 merge reviews per week.

How does it work? Quantexa's Entity Quality Management solution ingests a wide array of data, irrespective of its origin, embracing both internal and external sources with a model-agnostic approach that ensures rapid integration. It employs sophisticated AI/ML models to automatically parse, cleanse, and standardize data, laying the groundwork for a consistent and reliable data foundation. Source records are then transformed into an accurate single entity view by Quantexa's scalable and highly accurate entity resolution capability. The solution assesses each record against a set of rules designed to identify and rectify anomalies, such as historical overlinking, underlinking, potential duplicates, and various attribute inconsistencies, enriching the data's integrity and reducing false positives. The user experience is elevated through interactive dashboards that provide dynamic visualizations, breaking down data by quality indicators, technical specifications, and business attributes, enabling a thorough understanding of data quality and the focused identification of areas for remediation.

By leveraging Quantexa's EQM solution, organizations achieve situational awareness of data quality across the enterprise, enabling a focused approach to remediation. As a result, businesses can prioritize their data quality efforts efficiently, targeting the areas of greatest impact and aligning remediation tasks with current business priorities and operational demands. Integration with key enterprise services facilitates a seamless and effective data quality management process.
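To make the rule-based assessment concrete, here is a minimal sketch of the kind of check described above. It is not Quantexa's actual implementation, and all field and function names are illustrative: a resolved entity is flagged as potentially overlinked when its member records disagree on an attribute that should be stable for one real-world individual, such as date of birth.

```python
from collections import Counter

def flag_overlinking(entity_records, attribute="date_of_birth", max_distinct=1):
    """Flag an entity as potentially overlinked if its member records
    disagree on an attribute that should be stable for one real person.

    entity_records: list of dicts, one per source record resolved into the entity.
    Returns the rule outcome plus the evidence behind it.
    """
    values = [r[attribute] for r in entity_records if r.get(attribute)]
    distinct = Counter(values)
    return {
        "rule": f"distinct_{attribute}",
        "flagged": len(distinct) > max_distinct,  # more than one value -> suspicious
        "evidence": dict(distinct),               # which values appear, and how often
    }

# Example: three records resolved into one entity, two dates of birth observed.
records = [
    {"name": "J. Smith", "date_of_birth": "1980-04-02"},
    {"name": "John Smith", "date_of_birth": "1980-04-02"},
    {"name": "Jon Smith", "date_of_birth": "1975-11-19"},
]
print(flag_overlinking(records))
# -> flagged: True, evidence: {'1980-04-02': 2, '1975-11-19': 1}
```

A real rule set would combine many such checks (addresses, identifiers, name variants) and weight them, but the principle is the same: each rule produces a verdict and the evidence that supports it.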
Recording for The Power of “Contextually Connected Entities” for Maximising Decision Intelligence
Hi everyone! If you missed the webinar on The Power of “Contextually Connected Entities” for Maximising Decision Intelligence, you can now watch it on demand using the link below! https://info.quantexa.com/decision-intelligence-webinar-apac-deloitte We're keen to hear your thoughts on the topic, so please leave them below! Cheers
What is Entity Quality?
Most organizations are concerned with data quality. What's usually neglected is Entity Quality. But what exactly is Entity Quality? Entity Quality, or EQ for short, is a way to determine the quality of a resolved entity. Say we resolve an entity by matching 3 different records using some matching logic. The question that arises here is: do these 3 records actually represent the same real-world thing/entity? How confident are we that they do? Entity Quality is about answering exactly that, giving end users confidence via a calculated, aggregated score. At Quantexa, we provide Entity Quality Scoring (EQS), a way to provide confidence scores for resolved entities. It detects overlinking and underlinking in entities. Check out a short blog on the entity quality overlinking tool that has been developed based on this functionality. This functionality is embedded in the product, so users can easily view the generated scores and investigate any entity if and when needed. Do you measure entity quality at your organization? If yes, how? If not, do you think this could be useful? Please share your thoughts!
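As an illustration of the idea (a hand-rolled sketch, not Quantexa's EQS algorithm), one simple way to aggregate a confidence score for a resolved entity is to average the pairwise match scores of its member records: a low mean means the records barely match each other, hinting at overlinking.

```python
from itertools import combinations

def entity_quality_score(records, pair_score, low=0.4, high=0.9):
    """Aggregate pairwise match scores into one entity-level confidence score.

    records: the source records resolved into this entity.
    pair_score: function scoring how well two records match, in [0, 1].
    """
    pairs = list(combinations(records, 2))
    if not pairs:
        return {"score": 1.0, "signal": "single-record entity"}
    score = sum(pair_score(a, b) for a, b in pairs) / len(pairs)
    if score < low:
        signal = "possible overlinking"  # members barely match each other
    elif score > high:
        signal = "high confidence"
    else:
        signal = "review"
    return {"score": round(score, 3), "signal": signal}

# Toy pairwise scorer: fraction of shared attribute values.
def toy_pair_score(a, b):
    keys = set(a) & set(b)
    return sum(a[k] == b[k] for k in keys) / len(keys) if keys else 0.0

records = [
    {"name": "ACME LTD", "city": "London"},
    {"name": "ACME LTD", "city": "London"},
    {"name": "ACME GROUP", "city": "Leeds"},
]
print(entity_quality_score(records, toy_pair_score))
# -> {'score': 0.333, 'signal': 'possible overlinking'}
```

Note that underlinking (matches missed across entities) needs a complementary check against candidate records outside the entity, which this within-entity sketch deliberately omits.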
Aug 3 Webinar: The Power of “Contextually Connected Entities” for Maximising Decision Intelligence
As defined by Gartner, Decision Intelligence improves decision-making by understanding and engineering how decisions are made, and how outcomes are evaluated, managed, and improved by feedback. Using artificial intelligence, Decision Intelligence unlocks the power of entity resolution and graph analytics to turn data into accurate decisions at scale. This means that data becomes more than the sum of its parts: not piecemeal, siloed, and afflicted with poor quality. Decision Intelligence connects all data from previously siloed and scattered points and creates a single, trusted, and reusable resource. Join me and Melissa Ferrer, Partner, Data & AI at Deloitte, in a live webinar as we delve deeper into Decision Intelligence and the importance of Contextually Connected Entities in making informed decisions. Event page on the community: Webinar: The Power of “Contextually Connected Entities” for Maximising Decision Intelligence. The business world is more data-driven than ever, but making sense of the vast amounts of data available can be a daunting task. Many businesses struggle to bridge the gap between data and decision making, and often lack a comprehensive view of their customers. Or go directly to the event landing page: https://info.quantexa.com/decision-intelligence-webinar-apac?hs_preview=Novfdgge-122643133495
Exploring the Challenges of Achieving a Single View in Datawarehousing
When it comes to achieving a single view of individuals or businesses in data warehousing, several key insights emerge:
1️⃣ Data Integration: Integration is a critical aspect. Organizations often struggle to merge data from disparate sources such as customer databases, transaction systems, and marketing platforms. Ensuring seamless data integration is essential for a unified view.
2️⃣ Data Quality: Data quality plays a vital role in establishing a reliable single view. Inaccurate, incomplete, or inconsistent data can hinder decision-making and analysis. Implementing data cleansing processes and validation mechanisms is a crucial step towards maintaining high-quality data (see the sketch at the end of this post).
3️⃣ Data Silos: Data silos, where information is isolated within different systems or departments, pose a significant challenge. Overcoming them requires breaking down barriers, implementing data governance practices, and establishing data sharing mechanisms.
4️⃣ Business Context: Contextual understanding is crucial for creating a comprehensive view. Data needs to be interpreted within its specific business context to derive meaningful insights. Adapting to evolving business requirements and aligning data consolidation efforts accordingly is vital.
Questions I always have and continue to ask are:
🔸 How do you address the complexities of data integration when combining data from diverse sources?
🔸 What approaches have you found effective in ensuring data quality throughout the process?
🔸 Have you encountered challenges in breaking down data silos? How did you overcome them?
🔸 How do you incorporate the business context into your data consolidation efforts?
🔸 Are there any specific tools or technologies you recommend for achieving a single view in data warehousing?
Please share your thoughts and let's learn from one another!
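On the data quality point (2️⃣ above), here is a minimal, illustrative sketch (all field names and thresholds are assumptions, not a recommendation of any specific tool) of the kind of standardization and validation pass records from different sources might go through before being merged into a single view:

```python
import re

def standardize(record):
    """Normalize the fields we want to compare across sources."""
    out = dict(record)
    out["name"] = " ".join(record.get("name", "").upper().split())   # trim, casefold
    out["phone"] = re.sub(r"\D", "", record.get("phone", ""))        # digits only
    return out

def validate(record):
    """Return a list of data quality issues; an empty list means the record passes."""
    issues = []
    if not record["name"]:
        issues.append("missing name")
    if record["phone"] and len(record["phone"]) < 7:
        issues.append("implausibly short phone number")
    return issues

# Two views of the same person arriving from different systems.
sources = [
    {"name": "  jane   doe ", "phone": "+44 (0)20 7946 0958"},  # CRM export
    {"name": "JANE DOE", "phone": "02079460958"},               # transaction system
]
clean = [standardize(r) for r in sources]
for r in clean:
    print(r, validate(r) or "ok")
```

Even this toy example surfaces a real integration question: after cleansing, the two phone numbers still differ by a country code, which is exactly the kind of inconsistency a single-view effort has to decide how to reconcile.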
Data Quality is STILL a fundamentally unsolved issue!
Having met a couple of customers in the banking sector this week in South East Asia, one real pain point that was shared constantly was data quality: incomplete information, inconsistency, a lack of standardized entries, and so on. Issues that we all probably know of. That is not a surprise of any sort, but the fact that traditional methods are still used to tackle this ever-present and growing issue is what puzzles me. In an ever-evolving world where we are witnessing advancements in AI and other areas at an unprecedented pace, seeing organizations still struggle with such a foundational challenge should not be the case. The importance of a proper data foundation goes without saying; anything less leads to improper analytics, entity resolution, decisioning, etc. I'll probably need to (and will) write a blog about this in the coming weeks. I'm keen to hear from anyone about their approach/vision for tackling data quality issues: what are some of your most pronounced challenges? Cheers
To use a Graph DB or not?
Graph databases (Graph DBs) are gaining popularity as an alternative to traditional relational databases due to their ability to manage highly interconnected data. However, the decision of whether or not to use a Graph DB depends on several factors and is not straightforward. Some users of Graph DB technology have expressed frustration at the lack of tangible business outcomes after investing in projects for two years or more. They are now questioning the need to continue using the technology and paying for licenses and maintenance. The most complained-about aspects were performance issues, complexity, and lack of flexibility. It is essential to note that Graph DB technology is not a magic wand and requires significant pre-work to create meaningful connections between data. Despite its advantages in managing complex data relationships, a Graph DB is a data store after all, and has some shortcomings. If you have experience using Graph DBs, I would love to hear about your experience and the use cases you've used them for. Cheers
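For context on what "highly interconnected data" means in practice, here is a toy sketch (illustrative only, no particular database product) of the kind of multi-hop question that motivates graph stores: in a relational database it becomes repeated self-joins or a recursive CTE, whereas a graph model expresses it as a plain traversal. Note the pre-work caveat above still applies: the edges have to exist and be meaningful before any traversal is useful.

```python
from collections import deque

# Toy graph: who transacts with whom, as an adjacency list.
edges = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": [],
    "F": [],
}

def reachable_within(start, max_hops):
    """Breadth-first traversal: all parties within max_hops of start.
    In SQL this would be max_hops self-joins (or a recursive CTE)."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # don't expand past the hop limit
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen - {start}

print(reachable_within("A", 2))  # -> {'B', 'C', 'D', 'E'}
```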
Rethinking MDM and Our Approach to It - Complete Blog Series is out!
Hello folks,
The complete blog series about "Rethinking MDM and Our Approach to It" is now published. Here are the links for each individual part:
Part 1/3: There is no such thing as a "Single View"
Part 2/3: Data isn't a first-class citizen
Part 3/3: Traditional techniques & rigid models ought to die
Please share your thoughts on each of them. Happy reading!