Briefing Study: Requirements Analysis and Specification (slide 14)
Using the Logical Data Modelling technique
In principle, a logical data model consists of a diagram (a Logical Data Structure diagram or an Entity Relationship Diagram) showing the business entities, their inter-relationships, the names of those relationships and their cardinality. For example: each customer can place one or more orders; an order can be placed by one and only one customer. These are the relationship descriptions when viewed from each end, and you can also read from them that the cardinality is one customer to many orders. Have a think about an Order-line entity and its relationships and cardinality with Order and Product.
(If you need to, see a description of the notation used in LDS diagrams here...)
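As a rough illustration of those cardinalities, here is a minimal Python sketch. The class and attribute names are examples chosen for this note, not part of the LDM notation or any prescribed deliverable:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the cardinalities described above.
# Entity and attribute names are assumptions for the example.

@dataclass
class Customer:
    customer_id: str
    orders: list = field(default_factory=list)   # one customer, many orders

@dataclass
class Order:
    customer: Customer                           # one and only one customer per order
    order_number: int
    order_lines: list = field(default_factory=list)  # one order, many order-lines

@dataclass
class Product:
    product_code: str

@dataclass
class OrderLine:
    order: Order        # each order-line belongs to exactly one order...
    product: Product    # ...and refers to exactly one product
    quantity: int
```

Reading the code back gives you the same two-ended descriptions as the diagram: navigate from Customer to many Orders, or from an Order to its single Customer.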
In addition to the diagram, you will have each entity listed, with its unique identifying key and the business attributes that make it up. For example, one of my entities is Order; its unique identifying key is a combination of the customer's key and an order number. The Order entity has attributes such as Date-order-placed, an Email-address to be used for delivery, and a Notes attribute for any special instructions from the customer appearing on an order.
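That composite identifying key can be sketched in code. This is an illustrative Python sketch only; the customer key and attribute values below are made up for the example, and the attribute names follow the ones mentioned above:

```python
from dataclasses import dataclass

# The composite identifying key: customer's key + an order number.
# frozen=True makes the key immutable and hashable, so it can be used
# to look orders up and guarantee uniqueness.
@dataclass(frozen=True)
class OrderKey:
    customer_key: str
    order_number: int

@dataclass
class Order:
    key: OrderKey
    date_order_placed: str
    email_address: str   # delivery contact
    notes: str           # special instructions from the customer

# Keyed by the composite identifier, so duplicate keys cannot coexist.
orders = {}
o = Order(OrderKey("C001", 1), "2024-05-01", "alice@example.com", "Leave at door")
orders[o.key] = o
```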
You will also record the current and expected volumes against each entity, along with the expected growth rate. For example, you might get a rough average of five order lines on each order; record that against the Order-line entity. Then, using the expected number of orders per period, work out how many order lines that implies per period and record that too.
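The arithmetic is simple, as this sketch shows. The order count and growth rate are assumed example figures (only the five-lines-per-order average comes from the text):

```python
# Volumetrics sketch: derive Order-line volumes from Order volumes.
orders_per_month = 2_000       # assumed current volume for Order
avg_lines_per_order = 5        # rough average from the example above
annual_growth_rate = 0.10      # assumed 10% expected growth

# Order-line volume per period, recorded against the Order-line entity.
order_lines_per_month = orders_per_month * avg_lines_per_order

# Projected volume after a year's growth (rounded to whole order lines).
order_lines_next_year = round(order_lines_per_month * (1 + annual_growth_rate))
```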
Cross-validation with other techniques
If you are following these Briefing Study slides in sequence, you should be able to see some immediate cross-validation checks with the Data Flow Model (DFM):
- Do all the entities appear in datastores? If not, is a dataflow missing? Is an I/O Description missing something? If so, some aspect of a process has been missed, or indeed even a whole process.
- Also remember to consider static or reference data. At this point it's quite common to notice that processes to cover creation, modification and deletion have been overlooked in the DFM.
- If you have needed, because of the project landscape, to go to the lengths of describing all the attributes on each dataflow:
  - do all the business attributes identified on the LDM appear on at least one I/O Description? If not, where do they come from?
  - And, vice versa, are all business attributes on the I/O dataflows accounted for in the entities on the LDM?
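These checks amount to simple set comparisons, as the sketch below shows. The entity and attribute sets are hypothetical; in practice they would come from your own LDM and DFM documentation:

```python
# Cross-validation of the LDM against the DFM as set differences.
# All names below are example data, not a prescribed catalogue.
ldm_entities = {"Customer", "Order", "Order-line", "Product"}
dfm_datastore_entities = {"Customer", "Order", "Order-line"}

# Check 1: do all LDM entities appear in datastores? Anything left over
# suggests a missing dataflow, I/O Description detail, or whole process.
missing_from_datastores = ldm_entities - dfm_datastore_entities

ldm_attributes = {"Date-order-placed", "Email-address", "Notes"}
io_attributes = {"Date-order-placed", "Email-address", "Customer-ref"}

# Check 2: LDM attributes never seen on any I/O Description.
# If non-empty, ask: where do they come from?
unsourced = ldm_attributes - io_attributes

# ...and vice versa: dataflow attributes not accounted for on the LDM.
unaccounted = io_attributes - ldm_attributes
```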
Best Practice Tip
The type of logical data model of most use to the business analyst is one that models only, or mostly, the business-recognisable entities. This is because LDS diagrams are not the easiest for non-practitioners to understand or follow.
You need to strike a balance between what is high-level enough to be understandable by the business (for use in workshops or in one-on-ones) and what is precise enough for requirements analysis and logical design.
On large projects with many data entities you might end up with an additional lower-level LDM, one that allows the entities to be more easily mapped to the enterprise data model. That's the one guarded ferociously by the database administration team.