I ran into a fascinating problem today, but I'm pretty sure it's not unique.
Surely someone on here has thought about this before:

Imagine you have a table of Person and a table of Food, plus a PersonFood
table that models which people like which foods using two foreign keys, say
PersonID and FoodID. Simple enough.
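
Roughly, in SQL (the Name columns are just for illustration):

    CREATE TABLE Person (
        PersonID INTEGER PRIMARY KEY,
        Name     VARCHAR(100) NOT NULL
    );

    CREATE TABLE Food (
        FoodID INTEGER PRIMARY KEY,
        Name   VARCHAR(100) NOT NULL
    );

    -- One row per (person, food) pair that the person likes.
    CREATE TABLE PersonFood (
        PersonID INTEGER NOT NULL REFERENCES Person (PersonID),
        FoodID   INTEGER NOT NULL REFERENCES Food (FoodID),
        PRIMARY KEY (PersonID, FoodID)
    );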

Now, that's great if Sally likes apples, Billy likes apples and walnuts,
and Joe likes tacos. But what should be done about BigBob, who likes *all*
foods?

If we just mark BigBob as liking every food via PersonFood rows, then any
time a new Food is added, BigBob is out of date (he really does like *all*
foods!).

If instead we create a special food called "All Food" and mark BigBob with
that, then the count of how many foods he likes (which is really just the
count of all Foods) comes out as one instead of the actual food count; worse,
it's still possible to mark BigBob as liking "All Food" as well as "apples"
and get a food count of 2! Ugh.

And it's just as wrong to say "anyone with no records in the PersonFood
table likes all foods," because then you have no way of showing a Person
who genuinely likes zero of the Foods. Ugh, again.
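
To make the counting problem concrete: with the tables above, the obvious
per-person count is something like the query below, and a sentinel
"All Food" row would contribute exactly 1 to it no matter how many Foods
exist:

    -- How many foods does each person like? People with no rows get 0.
    SELECT p.PersonID, p.Name, COUNT(pf.FoodID) AS FoodsLiked
    FROM Person p
    LEFT JOIN PersonFood pf ON pf.PersonID = p.PersonID
    GROUP BY p.PersonID, p.Name;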

So I'm curious, what do you much more experienced DB guys do in such a case?
Solve it with a stored procedure? Use a boolean field in the Person table
for "all food eaters"? Or is there an actual way to model this with a field
in the PersonFood table?
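
In case it helps, here is the boolean-field version I was imagining;
LikesAllFoods is a made-up column name, and the exact types and syntax
will vary by database:

    -- Made-up flag: TRUE means this person likes every food, present and future.
    ALTER TABLE Person ADD COLUMN LikesAllFoods BOOLEAN NOT NULL DEFAULT FALSE;

    -- The count then needs a CASE so flagged people get the full food count.
    SELECT p.PersonID, p.Name,
           CASE WHEN p.LikesAllFoods
                THEN (SELECT COUNT(*) FROM Food)
                ELSE (SELECT COUNT(*) FROM PersonFood pf
                      WHERE pf.PersonID = p.PersonID)
           END AS FoodsLiked
    FROM Person p;

It works, but every query that touches PersonFood has to remember to check
the flag, which is what makes me suspect there's a cleaner way.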

thanks for any hints!