I have tables Orders (Id), Lot (Id, Name), and OrdersLot (IdOrder, IdLot). An order can contain multiple lots, and the many-to-many relationship is implemented through the OrdersLot table. How can I group the Orders that share the exact same set of lots?
Data example:

Orders:

| IdOrder |
| ------- |
| 1       |
| 2       |
| 3       |

Lot:

| IdLot |
| ----- |
| A     |
| B     |
| C     |

OrdersLot:

| IdOrder | IdLot |
| ------- | ----- |
| 1       | A     |
| 1       | B     |
| 2       | B     |
| 2       | C     |
| 2       | A     |
| 3       | A     |
| 3       | B     |
Based on this data, I would need to generate a unique value that is the same for Orders 1 and 3 (since they both have lots A and B) and different for Order 2.
The reason is that the logic I need to extend has a while loop that groups the orders by shared properties and then cycles through each OrderGroup, running queries only on the orders having those properties.
Any suggestions?
An option I considered is removing the OrdersLot table and saving a string with all the lot names in the Order table, but that definitely doesn't seem like the most performant approach.
Maybe the solution could be generating a unique value for each subset of lots (using their IDs, for example)?
2 Answers
If you just need a unique value for sets that look the same, I'd be tempted to just create it from the unique set of values. If they're integer keys, then they're unique, and you could do something as simple as `string_agg`-ing them together. Despite SQL being all about sets, doing set-equivalence work like this is kind of a challenge. Here's what I'd do. I created some junk data below. The first two queries are guaranteed to make some duplicates there; the last set is guaranteed not to.
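A minimal sketch of that idea, assuming SQL Server 2017+ (for `STRING_AGG`); the temp table `#OrdersLot` and the inserted values are illustrative assumptions:

```sql
-- Illustrative junk data for testing the fingerprint idea.
CREATE TABLE #OrdersLot (IdOrder int NOT NULL, IdLot char(1) NOT NULL);

-- The first two orders are guaranteed to share the same lot set...
INSERT INTO #OrdersLot (IdOrder, IdLot) VALUES (1, 'A'), (1, 'B');
INSERT INTO #OrdersLot (IdOrder, IdLot) VALUES (3, 'A'), (3, 'B');
-- ...and the last set is guaranteed not to match anything.
INSERT INTO #OrdersLot (IdOrder, IdLot) VALUES (2, 'A'), (2, 'B'), (2, 'C');

-- One deterministic fingerprint per order: same lot set => same string.
SELECT IdOrder,
       STRING_AGG(IdLot, ',') WITHIN GROUP (ORDER BY IdLot) AS LotSetKey
FROM #OrdersLot
GROUP BY IdOrder;
```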
The statement with the `string_agg` is the one you're after.

---
You can use STRING_AGG() to build up an ordered list of Lot IDs for each Order ID. The DENSE_RANK() function can then be used to assign distinct group IDs for each set of matching orders.
Something like:
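A minimal sketch, assuming SQL Server 2017+ and the table/column names from the question (the CTE name `OrderLotList` is an assumption):

```sql
-- Build one ordered lot list per order, then number the distinct
-- lists with DENSE_RANK() so matching orders get the same group ID.
WITH OrderLotList AS (
    SELECT IdOrder,
           STRING_AGG(IdLot, ',') WITHIN GROUP (ORDER BY IdLot) AS LotList
    FROM OrdersLot
    GROUP BY IdOrder
)
SELECT IdOrder,
       LotList,
       DENSE_RANK() OVER (ORDER BY LotList) AS LotGroupId
FROM OrderLotList
ORDER BY IdOrder;
```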
Sample results:
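With the question's sample data, the sketch above would produce something like:

| IdOrder | LotList | LotGroupId |
| ------- | ------- | ---------- |
| 1       | A,B     | 1          |
| 2       | A,B,C   | 2          |
| 3       | A,B     | 1          |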
Alternatively, you can just group by the constructed list as part of a larger query:
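A sketch of that variant under the same assumptions (the explicit cast just keeps the aggregated order list a predictable string type):

```sql
-- Group orders directly by their lot-list fingerprint,
-- collecting the orders that belong to each group.
WITH OrderLotList AS (
    SELECT IdOrder,
           STRING_AGG(IdLot, ',') WITHIN GROUP (ORDER BY IdLot) AS LotList
    FROM OrdersLot
    GROUP BY IdOrder
)
SELECT LotList,
       COUNT(*) AS OrderCount,
       STRING_AGG(CAST(IdOrder AS varchar(10)), ',')
           WITHIN GROUP (ORDER BY IdOrder) AS OrdersInGroup
FROM OrderLotList
GROUP BY LotList;
```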
Results:
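Against the same sample data, that would produce something like:

| LotList | OrderCount | OrdersInGroup |
| ------- | ---------- | ------------- |
| A,B     | 2          | 1,3           |
| A,B,C   | 1          | 2             |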
See this DB<>fiddle for a demo. (Updated for the recently posted sample data.)