It seems that these schemes often start as an honest mistake, and if no one notices, a fraudster 'accidentally' does it again.
It is critical to monitor transactions in order to pick up fraud, and then to close the loophole that was exploited to commit it. In a perfect world every system would be designed, coded and tested perfectly, but holes can and will be found, so monitoring and closure are key to minimising losses.
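As a minimal sketch of the monitoring idea, one simple rule is to flag transactions that sit far outside an account's usual range. The data, function name, and z-score threshold below are all illustrative assumptions, not any particular fraud system's method:

```python
# Minimal sketch of a transaction-monitoring rule (hypothetical data and
# threshold): flag any transaction far outside an account's usual range.
from statistics import mean, stdev

def flag_outliers(history, new_transactions, z_threshold=3.0):
    """Return transactions whose amount deviates from the historical
    mean by more than z_threshold standard deviations."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return [t for t in new_transactions if t != mu]
    return [t for t in new_transactions if abs(t - mu) / sigma > z_threshold]

history = [25.0, 30.0, 27.5, 22.0, 28.0, 31.0, 26.0, 29.0]
suspicious = flag_outliers(history, [27.0, 450.0])
print(suspicious)  # the 450.0 payment stands out; 27.0 looks routine
```

A real system would of course use far richer features (merchant, time of day, velocity), but even a crude baseline like this catches the "small mistake repeated at scale" pattern early.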
This is a blog containing data-related news and information that I find interesting or relevant. Links are given to the original sites containing source information, for which I can take no responsibility. Any opinion expressed is my own.
Showing posts with label DATA DESIGN.
Thursday, 20 April 2017
Tuesday, 27 May 2014
7 deadly sins of database design
This white paper in Information Management, sponsored by Embarcadero, goes through what its authors consider to be the 7 deadly sins of database design.
Whilst I roughly agree with their seven, I would update or add to the list:
5. Data quality can be enforced using alternatives to a foreign key or check constraint on the database. If you are following an object-oriented approach to the data, you can create a common method that ensures quality without adding database objects that could slow down every update or insert into that table.
8. Changes in company documentation or modelling standards over time often result in mismatched levels or standards across artefacts. It would be great if time were allowed to update older documentation as standards change. If that is impractical, then any project plan touching those items should include time to bring the artefacts up to the new standard.
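To illustrate point 5, here is a hypothetical sketch of a shared application-layer validation method standing in for a foreign-key and a check constraint. The table, rule names, and `KNOWN_CUSTOMER_IDS` set are invented for the example:

```python
# Hypothetical sketch: enforce data quality in one common application-layer
# method instead of per-table database constraints.
KNOWN_CUSTOMER_IDS = {101, 102, 103}  # stand-in for a reference-table lookup

class OrderValidationError(ValueError):
    """Raised when an order fails a data-quality rule."""

def validate_order(order):
    """Common method called before any insert/update on the orders table,
    so the rules live in one place rather than in database objects."""
    if order.get("customer_id") not in KNOWN_CUSTOMER_IDS:
        raise OrderValidationError("unknown customer_id")   # FK-style rule
    if order.get("quantity", 0) <= 0:
        raise OrderValidationError("quantity must be positive")  # check-style rule
    return order

validate_order({"customer_id": 101, "quantity": 3})  # passes validation
```

The trade-off is the usual one: the database can no longer guarantee the rules for writers that bypass this method, so this approach suits codebases where all writes go through the shared layer.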