Showing posts with label DATA QUALITY. Show all posts

Thursday, 23 June 2022

WEBINAR: Top 3 ways IMDb’s third-party data enhances user engagement and experience - 30 June 2022

 

 
Amazon Web Services
 
You're invited!

In this webinar, AWS joins hands with IMDb to showcase the top 3 ways third-party data improves user engagement and experience. Featuring step-by-step demonstrations and technical use cases, experts in the Media and Entertainment industry share tips on captivating audiences while reducing churn rates.

Register now

 
Thursday, June 30 | North America
2:00 PM EDT | 11:00 AM PDT
Length: 60 minutes

You will learn:

  • How third-party data provides additional context to first-party data
  • Solutions to common data analytics challenges
  • Ways that IMDb’s metadata on 12M+ cast and crew members and 8M+ movie and TV titles can help you create better experiences for your users
  • How easy-to-learn AWS services, such as Amazon Neptune and Amazon Personalize, provide additional resources to improve customer satisfaction
  • How to find, discover, and use third-party data and APIs from AWS Data Exchange

Who should attend:

BI managers and executives, product managers, Heads of Marketing, CMOs, Chief Data Officers, data scientists

*The views and opinions of IMDb and their presenters are their own and do not necessarily reflect the positions of AWS

Register now
 
 

Wednesday, 4 December 2019

Nordic data debacles tell story of numbers that aren’t true by Nick Rigillo and Catherine Bosley via @infomgmt

Scandinavia is offering a fresh case study this month in how even the world’s richest countries can struggle to measure their own economies and trust the data.

This is a lesson we should all learn from: be absolutely sure of your numbers, your data sources and the methodology behind every calculation in your analytics.

Monday, 11 November 2019

When it comes to data, why the 'garbage in, garbage out' doctrine is all wrong by Michael Kanellos via @infomgmt

The problem is that there’s way too much of it and it’s not organized in a way that makes it easy to understand. It doesn’t form beautiful crystalline patterns like salt: it’s more like a huge pile of gravel.

It's clear to me that you can check the quality of your data, but you shouldn't throw away anything that doesn't match your vision of correctness. Flag it as not being "right", but don't lose it - it could still give useful insights. Think of it this way: financial data must reconcile with the financial ledgers, and if you include the bad data it probably will - just make sure you mark it in some way.
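A minimal sketch of that flag-don't-delete approach (the field names are hypothetical): suspect rows stay in the set so totals still reconcile with the ledger, while analysts can filter the flagged rows out of their models.

```python
def flag_suspect(records, check):
    """Add a 'suspect' flag to every record; never drop a row."""
    for rec in records:
        rec["suspect"] = not check(rec)
    return records

ledger = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": -99999.0},  # looks wrong, but we keep it
]
flag_suspect(ledger, lambda r: -10000 < r["amount"] < 10000)

total = sum(r["amount"] for r in ledger)            # still matches the ledger
trusted = [r for r in ledger if not r["suspect"]]   # analysts filter instead
```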

Friday, 12 July 2019

The quality of its data can make or break an organisation by Bob Violino via @infomgmt

High-quality data can improve decision making, customer service, business processes and competitiveness. Poor quality data can potentially lead to financial ruin.

Time and time again I've explained the need for quality data. If the data is not clean, or not in a structure you understand, the results will not be reliable - they will be complete rubbish.

Friday, 28 June 2019

How to Ensure Data Consistency and Quality by Sally El Hadidi via @Datafloq

Poor data quality negatively impacts your company on many levels. Not only does it lead to bad decision-making, but it can be costly as well.

Some great suggestions. My own additions are:

  • Make sure that data definitions are consistent - e.g. is a customer number the same format in all sources?
  • Produce accurate and detailed documentation mapping the data in each system and how to join it together - you might be surprised just how difficult that is.
  • Use validation everywhere to keep data quality high - e.g. use drop-down boxes or master tables so that every field used to group or classify contains a standard set of values.
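As a minimal sketch of those validation suggestions (the customer-number format and the master region list are hypothetical examples, not a real standard): check each source row for a consistent customer-number format and for classification values drawn from a master table rather than free text.

```python
import re

CUSTOMER_NO = re.compile(r"^C\d{6}$")        # assumed house format
MASTER_REGIONS = {"EMEA", "APAC", "AMER"}    # assumed master table

def validate(row):
    """Return a list of data-quality errors for one source row."""
    errors = []
    if not CUSTOMER_NO.match(row.get("customer_no", "")):
        errors.append("customer_no not in standard format")
    if row.get("region") not in MASTER_REGIONS:
        errors.append("region not in master table")
    return errors
```

A clean row returns an empty list; a row with a free-text region and a differently formatted customer number returns two errors, which can be routed back to a data steward.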

Wednesday, 19 June 2019

Poor data quality causing majority of artificial intelligence projects to stall by Bob Violino via @infomgmt

Nearly eight out of 10 organizations using AI and ML report that projects have stalled, and 96 per cent of these companies have run into problems with data quality, says a new study.

This is a reason for the failure of many different techniques in many different projects. It is absolutely VITAL that you have good quality data in your systems and that there is good referential integrity across all joins.
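A quick sketch of the referential-integrity point (table and key names are hypothetical): every fact row's foreign key should resolve to a key in the dimension table, and any orphans should be surfaced rather than silently dropped by a join.

```python
def orphaned_rows(fact_rows, dim_keys, key="customer_id"):
    """Return fact rows whose foreign key has no match in the dimension."""
    return [row for row in fact_rows if row[key] not in dim_keys]

facts = [
    {"customer_id": 1, "amount": 10},
    {"customer_id": 99, "amount": 5},   # no matching customer record
]
customer_keys = {1, 2, 3}
orphans = orphaned_rows(facts, customer_keys)
```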

Friday, 3 May 2019

Success with online sales starts with strong data quality by Susan Pichoff via @infomgmt

Selling online works well when complete and accurate product information is readily available, easily found and reliable.

I would add to Susan's article that data stewardship is key: you need key people to take responsibility for the product data, (a) ensuring it is correct and (b) correcting it quickly when errors are found. Susan is also right that incorrect or incomplete product information turns customers off - they are very unlikely to purchase from you.

Friday, 19 April 2019

Data quality issues - Who is responsible for resolving them? by Nicola Askham via @infomgmt

Your governance team will develop knowledge and expertise about the data your organization creates and uses, but they are not responsible for any cleansing that may be required.

Some great advice from Nicola which could form the centre of your own organisation's data quality and data stewardship strategy.

Monday, 15 April 2019

Morgan Stanley focuses on data quality to strengthen AI by Penny Crosman via @infomgmt

The bank is one of many to realize that artificial intelligence is only as good as the data fed into it.

They have it completely right - you absolutely cannot rely on answers if the underlying data is wrong in any way.

Wednesday, 23 January 2019

Strong data quality key to success with machine learning, AI or blockchain by Tendü Yoğurtçu via @infomgmt


Enterprises must be skeptical of data as it essentially determines how the AI will work and bias in the data may be inherent because of past customers, business practices and sales.

The past bias could be inherent in the data due to the design of legacy systems, the team typing it in, the customers, the way it was governed by the business, or a combination of these. Historic data should therefore always be treated with great suspicion until you have completed an exercise to check the systems, the data meanings, the standards and the governance. Please don't make huge business decisions based on data you can't prove is clean and unbiased.

Tuesday, 20 November 2018

WEBINAR: Transforming 3rd Party Data Into Actionable Insights - 28 November 2018



Register Now!
The rise of third party or external data has given data scientists and organisations additional building blocks to discover breakthrough insights. But many data scientists struggle to understand what third party data is relevant and struggle further to efficiently access and transform that data.

In today’s Data Science Central webinar, we’ll explore innovative techniques to simplify third party data access and transformation.

You will learn:
  • Techniques for assessing third party data quality and relevance
  • Strategies for accessing third party data
  • Information about the third party data landscape as it applies to business outcomes

Speakers:
Mark Hookey, CEO -- DemystData
Richard Scioli, General Manager, Platform -- DemystData

Hosted by: Bill Vorhies, Editorial Director -- Data Science Central

Title: Transforming 3rd Party Data Into Actionable Insights
Date: Wednesday, November 28th, 2018
Time: 09:00 AM - 10:00 AM PST

Space is limited so please register early:
Reserve your Webinar seat now

After registering you will receive a confirmation email containing information about joining the Webinar.

Wednesday, 7 November 2018

3 best practices for improving and maintaining data quality by Maxim Lukichev via @infomgmt

Organisations are increasingly relying on insights generated by data analysis, and they realise that insights are only as good as the data they come from.

Maxim makes some very good points here. Any data analysis built on bad data is at best worthless and at worst destructive for your business, because you will be making key decisions based on something that is not correct. Validate your data to make sure it is trustworthy, and build a network of data stewards in your business to ensure that data is correct and that processes - and in some cases systems - are updated so that quality is improved and assured going forward.

Wednesday, 3 October 2018

Building the ideal data quality team starts with these roles by Wilfried Lemahieu, Seppe vanden Broucke and Bart Baesens via @infomgmt

Poor data quality impacts organisations in many ways. At the operational level, it has an impact on customer satisfaction, increases operational expenses and will lead to lowered employee job satisfaction.

Great list of job roles and a blueprint of roles that we could all aim for if we understand each one of them.

Monday, 13 August 2018

Data veracity challenge puts spotlight on trust by Pat Sullivan via @infomgmt

The data veracity challenge is one that most businesses have yet to come to grips with, but if we’re to fully harness data for the full benefit to businesses and society, then this challenge needs to be addressed head on.

Automated reporting is great for businesses, yes, but as Pat's article suggests, you absolutely have to be confident in your data if your business is going to be run on it: confident that you can rely on its quality; that you know the journey of the data from the original source to wherever you report from; that you understand what the data means (data management); that you can join it with other data to produce something useful; and that any analysis, visualisations and algorithms are correctly defined and unbiased. Otherwise the investment based on them is wasted.

Monday, 6 August 2018

How to spot bad data, and know the limitations when it's good by Kayla Matthews via @infomgmt

Accurate and reliable data can bring context to research studies, help people understand trends, aid business managers in knowing what’s working well for achieving company goals and much more. However, not all data is as beneficial as it seems at first.

I completely agree with Kayla's observations. If you have a large enough team, one good practice to reduce the possibility of bias is to have a colleague prepare the data for you with only a vague idea of what you need. I cannot stress enough that to guarantee the quality of your data you need to take it from the system of record, make sure it has not been modified before you receive it, and ensure that you really understand what the data elements mean.

Friday, 22 June 2018

How to know when data is 'right' for its purpose by Annette Wright via @infomgmt

The key to evaluating the accuracy of data is more about understanding the eventual use of it than any arbitrary or independent measure.

I agree with Annette, although I would draw your attention to some ways to help make sure that data is correct.

1. For codes, always provide values to select from - you cannot guarantee the value chosen is the right one, but ensuring there is a finite list of values for that field is a major step forward.

2. For some fields, use publicly available data to limit entry to valid values - for example master postal code lists, master lists of registered companies, or master lists of ISO values for things like country numbers and language codes. Again, you cannot guarantee the correct value is selected, but you can at least make sure the value selected comes from a finite master list AND is a valid value.

3. Make sure that all customer-facing systems give the customer a mandatory chance to check and correct their own data.
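Point 2 can be sketched as follows - only a handful of ISO 3166-1 alpha-2 codes are shown for illustration; a real system would load the full published list from a master table.

```python
ISO_COUNTRIES = {"GB": "United Kingdom", "US": "United States",
                 "DE": "Germany", "SE": "Sweden"}

def normalise_country(value):
    """Accept only codes on the master list; reject anything else."""
    code = value.strip().upper()
    if code not in ISO_COUNTRIES:
        raise ValueError(f"{value!r} is not on the country master list")
    return code
```

So `normalise_country(" gb ")` cleans up to `"GB"`, while free-text entries like `"Sverige"` are rejected at the point of entry instead of polluting the database.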

Update your processes to ensure that system design takes all of these things into account - time for a culture change to make sure data quality is a top priority in your organisation.

Monday, 7 May 2018

Getting data management right to drive business action by Anna Johansson via @infomgmt

By developing clear objectives and pairing them with actionable data classes, you can propel your business to the top.

It really is crucial that you look at the quality of your data and make sure that it is clean so that it is usable.

Monday, 30 April 2018

Organizations gaining new benefits by automating data engineering by Jelani Harper via @infomgmt

A number of advancements have now decreased data preparation time while increasing the time available for exploration and applications.

I think there are several possible tools that enable users to do their own querying and this article talks about the one that the author is most familiar with.  Some organisations use Tableau, Pentaho or Power BI. I'm sure there are others I have not listed.

Friday, 20 April 2018

If Your Data Is Bad, Your Machine Learning Tools Are Useless by Thomas C. Redman via @HarvardBiz

Poor data quality is enemy number one to the widespread, profitable use of machine learning. While the caustic observation, “garbage-in, garbage-out” has plagued analytics and decision-making for generations, it carries a special warning for machine learning.

Very good advice in this article - worth bookmarking and taking notes on, because you really need to be following these steps if you want to be successful with machine learning.

Friday, 23 March 2018

SLIDESHOW: 20 best practices of top chief digital officers by David Weldon via @infomgmt

Growth in digital transformation efforts in healthcare is driving the need for more CDOs, who need to be versed in strategy, governance and execution.

Slide 11 is really key - you need to get the C-level people behind you and seeing the benefits of what you are trying to achieve.