How do I create a statement to add additional rows that duplicate the data from already existing rows?

Asked by xyhw6mcr on 2023-02-28 in Other

Here is a simplified example of what my output currently looks like:
| Name | Month | Year | Filter | Data |
| ------------ | ------------ | ------------ | ------------ | ------------ |
| Bob | Feb | 2018 | Good! | 10 |
| Bob | Apr | 2018 | Bad | 0 |
| Tom | Dec | 2019 | Good! | 5 |
| Tom | Feb | 2020 | Bad | 0 |
| Sam | Feb | 2018 | Good! | 2 |
| Sam | Apr | 2018 | Bad | 0 |
| Tim | Oct | 2022 | Good! | 50 |
| Tim | Jan | 2023 | Bad | 0 |

My Goal: I'm trying to duplicate any record whose Filter = Good! and apply new months to it. Within my data there are gaps between months, and I need everything in sequential order by month until the Filter = Bad. Ultimately I need my output to look like this:

| Name | Month | Year | Filter | Data |
| ------------ | ------------ | ------------ | ------------ | ------------ |
| Bob | Feb | 2018 | Good! | 10 |
| Bob | Mar | 2018 | Good! | 10 |
| Bob | Apr | 2018 | Bad | 0 |
| Tom | Dec | 2019 | Good! | 5 |
| Tom | Jan | 2020 | Good! | 5 |
| Tom | Feb | 2020 | Bad | 0 |
| Sam | Feb | 2018 | Good! | 2 |
| Sam | Mar | 2018 | Good! | 2 |
| Sam | Apr | 2018 | Bad | 0 |
| Tim | Oct | 2022 | Good! | 50 |
| Tim | Nov | 2022 | Good! | 50 |
| Tim | Dec | 2022 | Good! | 50 |
| Tim | Jan | 2023 | Bad | 0 |

Here is what my actual SQL output looks like (screenshot omitted). There I am grouping by LotSysID, VehicleMainSysID, MonthNumber, YearNumber, and Description. Instead of Name in my example output, I am using VehicleMainSysID and LotSysID; instead of Filter, I am using Description, where Bad = Transferred and Good! = Retail Inventory.

Essentially, I have data from 2018 up until now, and I need to fill any gaps of months where Filter = Good! until it equals Bad. There may not even be a row where Filter = Bad; in those cases I would just need the data to keep being duplicated up to the current month of the current year. NOTE: there is no set start date where all the data begins; depending on the name, the data can start in any month from 2018 to the present. I need a dynamic statement that creates the additional rows to fit my business logic.

I've thought of using a recursive statement but was told that it would be incredibly taxing on our server since I'm pulling millions of records through my query.

Answer #1, by pengsaosao:

Here's one way: find the first month and the last month (when the last month is 'Bad') for each name, join that to a list of all months from the earliest date in the data until today, then use OUTER APPLY to find the previous value and determine what to fill the gaps with.

Working example in this db<>fiddle, including the edge case where a person's latest Filter = 'Good!'.
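To try this outside the fiddle, a minimal local setup might look like the sketch below. The table name dbo.whoknows is just the placeholder used in the query, and the column types (integer MonthNumber/YearNumber, varchar Name/Filter) are assumptions; map them to your real LotSysID/VehicleMainSysID/Description columns as needed.

```sql
/* Hypothetical setup matching the sample data in the question.
   Table and column names are assumptions chosen to match the query below. */
CREATE TABLE dbo.whoknows
(
  Name        varchar(50) NOT NULL,  /* stands in for LotSysID / VehicleMainSysID */
  MonthNumber int         NOT NULL,  /* 1-12 */
  YearNumber  int         NOT NULL,
  Filter      varchar(20) NOT NULL,  /* stands in for Description: 'Good!' = Retail Inventory, 'Bad' = Transferred */
  Data        int         NOT NULL
);

INSERT INTO dbo.whoknows (Name, MonthNumber, YearNumber, Filter, Data)
VALUES
  ('Bob',  2, 2018, 'Good!', 10),
  ('Bob',  4, 2018, 'Bad',    0),
  ('Tom', 12, 2019, 'Good!',  5),
  ('Tom',  2, 2020, 'Bad',    0),
  ('Sam',  2, 2018, 'Good!',  2),
  ('Sam',  4, 2018, 'Bad',    0),
  ('Tim', 10, 2022, 'Good!', 50),
  ('Tim',  1, 2023, 'Bad',    0);
```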

```sql
/* first day of the current month, used as the open-ended cutoff */
DECLARE @BeginOfMonth date = DATEADD(DAY, 1, EOMONTH(GETDATE(), -1));

;WITH Origin AS
( /* let's get the data and fix the year/month column */
  SELECT Name, Filter, Data,
    [Month]   = DATEFROMPARTS(YearNumber, MonthNumber, 1),
    LastMonth = MAX(DATEFROMPARTS(YearNumber, MonthNumber, 1))
                OVER (PARTITION BY Name)
  FROM dbo.whoknows /* we don't know your table name */
), Agg AS
( /* let's get the first and last month for each name */
  SELECT Name, S = MIN([Month]), E = COALESCE
         (MAX(CASE WHEN [Month] = LastMonth AND Filter = 'Bad'
          THEN [Month] END), @BeginOfMonth)
  FROM Origin GROUP BY Name
), AllMonths(m) AS
( /* let's get ALL the possible months to fill gaps */
  SELECT MIN(S) FROM Agg
  UNION ALL
  SELECT DATEADD(MONTH, 1, m) FROM AllMonths
  WHERE m < @BeginOfMonth
)
SELECT Agg.Name,
  [Month] = DATENAME(MONTH, am.m),
  [Year]  = YEAR(am.m),
  Filter  = COALESCE(o.Filter, GapFill.Filter),
  Data    = COALESCE(o.Data,   GapFill.Data)
FROM AllMonths AS am
INNER JOIN Agg
  ON am.m >= Agg.S AND am.m <= Agg.E
LEFT OUTER JOIN Origin AS o
  ON o.Name = Agg.Name AND o.[Month] = am.m
OUTER APPLY
( /* most recent earlier row for this name supplies the gap-fill values */
  SELECT TOP (1) org.Filter, org.Data
  FROM Origin AS org
  WHERE org.Name = Agg.Name AND org.[Month] < am.m
  ORDER BY org.[Month] DESC
) AS GapFill
ORDER BY Agg.Name, am.m OPTION (MAXRECURSION 1000);
```

Output:

| Name | Month | Year | Filter | Data |
| ------------ | ------------ | ------------ | ------------ | ------------ |
| Bob | February | 2018 | Good! | 10 |
| Bob | March | 2018 | Good! | 10 |
| Bob | April | 2018 | Bad | 0 |
| Sam | February | 2018 | Good! | 2 |
| Sam | March | 2018 | Good! | 2 |
| Sam | April | 2018 | Bad | 0 |
| Tim | October | 2022 | Good! | 50 |
| Tim | November | 2022 | Good! | 50 |
| Tim | December | 2022 | Good! | 50 |
| Tim | January | 2023 | Bad | 0 |
| Tom | December | 2019 | Good! | 5 |
| Tom | January | 2020 | Good! | 5 |
| Tom | February | 2020 | Bad | 0 |
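A note on the recursion concern in the question: the recursive part here is only the AllMonths calendar CTE, which adds one row per calendar month between the earliest month in the data and today, so it runs a few dozen iterations no matter how many source rows there are (OPTION (MAXRECURSION 1000) caps it at 1000 months). If you still want to avoid recursion entirely, the month list can be generated set-based instead. A minimal sketch, assuming SQL Server 2022's GENERATE_SERIES and the same dbo.whoknows placeholder table; on older versions a numbers/tally table serves the same purpose:

```sql
DECLARE @BeginOfMonth date = DATEADD(DAY, 1, EOMONTH(GETDATE(), -1));

/* earliest month present in the data */
DECLARE @FirstMonth date =
  (SELECT MIN(DATEFROMPARTS(YearNumber, MonthNumber, 1)) FROM dbo.whoknows);

DECLARE @MonthCount int = DATEDIFF(MONTH, @FirstMonth, @BeginOfMonth);

/* one row per month from @FirstMonth through the current month,
   with no recursion (GENERATE_SERIES requires SQL Server 2022) */
SELECT m = DATEADD(MONTH, gs.value, @FirstMonth)
FROM GENERATE_SERIES(0, @MonthCount) AS gs;
```

This derived list of months could stand in for the AllMonths CTE in the query above; the joins and the OUTER APPLY gap-fill stay the same, and the OPTION (MAXRECURSION 1000) hint would no longer be needed.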
