Good morning,
Does anyone know if there is a way to eliminate duplicate records in the following situation?
I have a query that returns a result set.
All fields repeat except NDO and NDOGroup; it is these two fields that are duplicating the records.
I am forced to include these two fields in my GROUP BY when I run the SELECT.
Example:
SELECT
AD.[vch_od],
MK.NDMarket,
MK.TypeMercado,
MK.NDOrigin as Praça,
TR.Analyst,
-- DTB.Season,
-- DTB.EvDef,
AD.[dtt_data_captura],
AD.semana,
AD.DataMes,
AD.NDO,
AD.NDOGroup,
-- PR.valorpremium,
AD.[Flt Type],
IND.[Flt Type],
AD.[Flt_base] as AD,
G3.[flt_base] as G3,
JJ.[flt_base] as JJ,
OC.[flt_base] as OC,
P3.[flt_base] as P3
-- IND.[flt_base] as IND
from #TARIFASAD AD
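-- NOTE: the joins that define the MK, TR, IND, G3, JJ, OC and P3 aliases used above were omitted from this post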
GROUP BY
AD.[vch_od],
MK.NDMarket,
MK.TypeMercado,
MK.NDOrigin,
TR.Analyst,
-- DTB.Season,
-- DTB.EvDef,
-- AD.NDOGroup,
AD.[dtt_data_captura],
AD.semana,
AD.DataMes,
AD.NDO,
AD.NDOGroup,
-- PR.valorpremium,
AD.[Flt Type],
IND.[Flt Type],
AD.[Flt_base],
G3.[flt_base],
JJ.[flt_base],
OC.[flt_base],
P3.[flt_base]
-- IND.[flt_base] as IND
Today my result is this:
AJUREC RECAJU Monopoly REC Diego 12/1/2016 11/27/2016 DRT CNX 415.843396
AJUREC RECAJU Monopoly REC Diego 12/2/2016 11/27/2016 DRT CNX 303.539575
BPSSSA SSABPS Super-Competitive SSA Diego 12/1/2016 11/27/2016 DRT CNX 290.775
BPSSSA SSABPS Super-Competitive SSA Diego 12/2/2016 11/27/2016 DRT CNX 217.514213
There are no duplicate records; it brings me the average fare per capture date (grouped).
I need to add the two columns NDO and NDOGroup,
which are:
NDOGroup
(15-21)
(15-21)
(22-29)
(22-29)
(15-21)
(15-21)
NDO
15
17
20
22
27
29
DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) AS NDO,
CASE
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 1  THEN 'NDO 00'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 5  THEN '(1-4)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 8  THEN '(5-7)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 15 THEN '(8-14)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 22 THEN '(15-21)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 30 THEN '(22-29)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 50 THEN '(30-49)'
    ELSE '50+'
END AS NDOGroup
I understand that it is the dtt_data field, shown in the example below, that differentiates the rows. Even though dtt_data_captura is 12/01/2016 on several lines, the dtt_data field will be different on each of them, so the DATEDIFF above will sometimes give me a difference of 1 day, sometimes 3, and so on (see the small example after the table below).
dtt_data_captura dtt_data
12/1/2016 13/3/2016
12/1/2016 16/3/2016
12/1/2016 21/3/2016
12/1/2016 17/3/2016
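Just to make the bucketing concrete, here is a small self-contained sketch (made-up dates, not my real data; the QL2 columns are replaced by an inline VALUES list) showing how the DATEDIFF and the CASE above classify each row:
-- hypothetical capture/flight dates, only to show how the CASE buckets the day difference
SELECT
    x.NDO,
    CASE
        WHEN x.NDO < 1  THEN 'NDO 00'
        WHEN x.NDO < 5  THEN '(1-4)'
        WHEN x.NDO < 8  THEN '(5-7)'
        WHEN x.NDO < 15 THEN '(8-14)'
        WHEN x.NDO < 22 THEN '(15-21)'
        WHEN x.NDO < 30 THEN '(22-29)'
        WHEN x.NDO < 50 THEN '(30-49)'
        ELSE '50+'
    END AS NDOGroup
FROM (VALUES
    (CAST('2016-12-01' AS date), CAST('2016-12-16' AS date)),  -- 15 days -> '(15-21)'
    (CAST('2016-12-01' AS date), CAST('2016-12-26' AS date))   -- 25 days -> '(22-29)'
) AS d (dtt_data_captura, dtt_data)
CROSS APPLY (SELECT DATEDIFF(dd, d.dtt_data_captura, d.dtt_data) AS NDO) AS x;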
But I would like to find a way to bring a single NDOGroup per group, without repeating it: NDOGroup (15-21), (22-29).
CASE
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 1  THEN 'NDO 00'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 5  THEN '(1-4)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 8  THEN '(5-7)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 15 THEN '(8-14)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 22 THEN '(15-21)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 30 THEN '(22-29)'
    WHEN DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data]) < 50 THEN '(30-49)'
    ELSE '50+'
END AS NDOGroup
And, along with that single NDOGroup field, bring the corresponding NDO:
DATEDIFF(dd, QL2.[dtt_data_captura], QL2.[dtt_data])
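One idea (just a sketch, assuming the smallest NDO per group is the one I want, that the fare can simply be averaged, and with the joins omitted as in the query above) would be to take NDO and NDOGroup out of the GROUP BY and rebuild them from an aggregate:
-- sketch only: column names follow the query above
SELECT
    AD.[vch_od],
    AD.[dtt_data_captura],
    MIN(AD.NDO) AS NDO,                        -- or MAX(AD.NDO), whichever single value is kept
    CASE
        WHEN MIN(AD.NDO) < 1  THEN 'NDO 00'
        WHEN MIN(AD.NDO) < 5  THEN '(1-4)'
        WHEN MIN(AD.NDO) < 8  THEN '(5-7)'
        WHEN MIN(AD.NDO) < 15 THEN '(8-14)'
        WHEN MIN(AD.NDO) < 22 THEN '(15-21)'
        WHEN MIN(AD.NDO) < 30 THEN '(22-29)'
        WHEN MIN(AD.NDO) < 50 THEN '(30-49)'
        ELSE '50+'
    END AS NDOGroup,
    AVG(AD.[Flt_base]) AS AD_base              -- assumed: average fare per capture date
FROM #TARIFASAD AS AD
GROUP BY
    AD.[vch_od],
    AD.[dtt_data_captura];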
I also thought of something along these lines: deleting records based on the NDOGroup and NDO fields, keeping only the smallest or the largest value:
DELETE a FROM #TARIFASAD AS a, nomes AS b WHERE a.vch_od=b.vch_od AND a.NDOGroup < b.NDOGroup
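For reference, if the idea really is to physically delete the duplicated rows from #TARIFASAD and keep only the one with the smallest NDO per group, a common pattern (just a sketch; I am assuming that vch_od plus dtt_data_captura identify a group) is ROW_NUMBER() inside a CTE:
-- keep the row with the smallest NDO per (vch_od, dtt_data_captura); delete the rest
;WITH numbered AS (
    SELECT ROW_NUMBER() OVER (
               PARTITION BY a.vch_od, a.[dtt_data_captura]
               ORDER BY a.NDO ASC              -- use DESC to keep the largest NDO instead
           ) AS rn
    FROM #TARIFASAD AS a
)
DELETE FROM numbered
WHERE rn > 1;                                  -- removes every duplicate except the kept row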
Thank you
It did not delete any records. I am checking the other filters.
– Thais
Strange. If you run just the first SELECT by itself, it returns records, right? Then running the second one should return rows as well, so the DELETE definitely has to delete something.
– Reginaldo Rigo
Yes, and it gives an error at the end: )) AND ....
– Thais
I did a little update on the answer.
– Reginaldo Rigo
Hi, I couldn’t get it to work. It still gives an error and still duplicates.
– Thais
Let’s see. The DELETE worked and removed several rows. But if you rerun your original SELECT, the duplicated rows that were already there are still there. They have not been deleted? Is that it?
– Reginaldo Rigo
I put your answer into my query and it does not run, even when I try to adapt it. It gives an error on a parenthesis that I cannot identify. If I adapt it by removing that part, it does not delete the records.
– Thais
I added a script for you to test.
– Reginaldo Rigo