I have a spreadsheet with a column that contains duplicate entries.
For example, what I have is this:
1
1
1
2
2
2
3
The column result I need is this:
1
2
3
If your duplicates are in column A and start in A1, then put this in column B:
=IF(COUNTIF($A$1:A1,A1)=1,A1,"")
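For anyone following along outside Excel, the keep-first logic this COUNTIF formula implements (show a value only the first time it appears, blank the repeats) can be sketched in Python; the function name is just illustrative:

```python
def first_occurrences(values):
    """Return a list the same length as `values`, keeping each value
    the first time it appears and blanking the repeats."""
    seen = set()
    result = []
    for v in values:
        if v in seen:
            result.append("")      # duplicate: leave the cell blank
        else:
            seen.add(v)
            result.append(v)
    return result

print(first_occurrences([1, 1, 1, 2, 2, 2, 3]))
# → [1, '', '', 2, '', '', 3]
```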
If you would like to get the numbers one after the other, then use the array formula below (confirm with Ctrl+Shift+Enter):
{=INDEX($B$1:$B$2150,SMALL(IF(ISNUMBER($B$1:$B$2150),ROW($B$1:$B$2150)),ROW(1:1)))}
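The INDEX/SMALL array formula packs the surviving numbers to the top of the column, skipping the blanks. The same compaction step, sketched in Python purely for illustration:

```python
def compact(column):
    """Drop the blanks left by the first-occurrence pass, packing
    the remaining values to the top of the column."""
    return [v for v in column if v != ""]

print(compact([1, "", "", 2, "", "", 3]))  # → [1, 2, 3]
```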
Amazing.
Works like a charm.
Thank you very much for the help.
I am having a problem with your last formula.
The first cell in the column (where I originally put the formula) works fine.
However, the next cell(s) show the #NUM! error. How can this be solved?
Thanks for your help.
The free addin from http://www.asap-utilities.com performs this and many other functions.
VBA example attached; I prefer to select duplicates as a check and then delete them.
This method is not the be-all and end-all! It will get the job done, but it evaluates the formula twice to get the result, which suppresses the #NUM! error (SMALL returns #NUM! once the row index passed to it exceeds the number of remaining matches). As a rule, though, you should avoid evaluating the same function more than once where possible.
{=IF(ISERROR(INDEX($B$1:$B$2150,SMALL(IF(ISNUMBER($B$1:$B$2150),ROW($B$1:$B$2150)),ROW(1:1)))),"",INDEX($B$1:$B$2150,SMALL(IF(ISNUMBER($B$1:$B$2150),ROW($B$1:$B$2150)),ROW(1:1))))}
Array entered (Ctrl+Shift+Enter).
Yeah, that is quite a formula!
hmmmm
Try Advanced Filter, selecting unique values only and placing the results either in place or in another column; it will remove any duplicates in a flash.
The above formulae are pretty impressive, but as array formulae filled with volatile ROW functions, you will notice a performance degradation.
Advanced Filter has the same facility built in and does it in a flash. The downside is that you have to refresh it, but hey, we can't have everything perfect, eh?!
Hi. The OP requested that the duplicates be removed, and that will take some kind of action to perform, whether formulae or Excel's built-in features (an example would be the wonderful suggestion Chris D offered, Advanced Filter). That will work, but filtered in place it only hides the unwanted (filtered-out) data rows.
To achieve exactly what you require we need to look deep into the backbone of Excel, the world of VBA.
There are so many different ways of doing this task that no one can claim to cover them all. If you check Dave Hawley's working examples there is a script called Remove Duplicates, I recall, and it will kill off any duplicates for good; I will also provide some code for you to try.
Remember to experiment on a worthless document, not your real work, then test on a copy of your real work until you are 100% completely happy. This VBA action cannot be reversed; we are deleting, as has been requested.
I assume readers can attach VBA script to a command button; if that is not the case, for the finest full step-by-step instructions on the method I highly suggest checking one of Dave Hawley's newsletters. At the end of each edition, wonderfully easy-to-follow guidelines are offered.
HTH
Jack in the UK
BTW, I must add that by default Excel will delete the second and later occurrences of a duplicate, meaning the first (original) is left intact. This is worth remembering. It can be worked around, but that takes some complex syntax and extensive code to decide which occurrence to keep in each case.
Code 1:
This code will do as you require. I wrote it just for fun, and it ties in nicely with the Advanced Filter method. It assumes the data to check resides in column A, and also assumes you have a header row title.
Sub Remove_Duplicates_()
    Dim n As Range
    ' Filter column A in place so only the first (unique) occurrences are visible
    [a:a].AdvancedFilter Action:=xlFilterInPlace, Unique:=True
    ' Remember those unique rows
    Set n = [a:a].SpecialCells(xlCellTypeVisible)
    ' Remove the filter, then hide the unique rows...
    ActiveSheet.ShowAllData
    n.EntireRow.Hidden = True
    ' ...so the rows still visible are the duplicates: delete them
    [a:a].SpecialCells(xlCellTypeVisible).EntireRow.Delete
    ' Unhide the surviving (unique) rows
    [a:a].EntireRow.Hidden = False
End Sub
Jack,
Advanced Filter will filter them either in place or to another location; it doesn't just "hide the filtered rows" at all.
It thus achieves exactly what the OP is asking for.
Hi Chris
This is a good solution, but reading the OP's question about deleting, I used the filter to delete the duplicates.
Agreed, we can move/transfer the data around, but only within one sheet; the copy cannot target a different sheet, so I chose my method, else you end up with twice as much data in one sheet. I guess, as an alternative to deleting, and to preserve the original data (always a good idea), we could filter to unique values, copy the visible cells only, transfer them on the fly in VBA and then return the sheet to normal. This is possible and offers another alternative.
The code I posted is not mine; it's from the web somewhere and very commonly used. I have developed it to do all sorts of things, but left it simple here. Try it and you will find it is very powerful. Neither solution is better, just a different approach; I simply deleted, as was asked.
It's tricky to convert Excel's general functions and tools to VBA until you have grasped them. I still find most extremely tricky; it takes ages to perfect a routine from top to bottom.
Jack
I still think it is much safer to select duplicates in some way first, either AutoFilter or VBA. These entries can then be checked before deletion.
Here's a VBA way to delete duplicates, but there is no way back, except to close the file without saving changes.
Hi Roy
Your code is the same as the one I posted; a nice script, very adaptable I find, and as I said, very commonly used. This thread has proved interesting; I'll design some code to do a bit more, which can hopefully cover the issues and concerns all the posters have raised, so the richness of this thread is only added to.
There are many ways to skin a cat, as they say. I can only offer my ideas; then you guys can play with and utilize them, code them your way and, with luck, develop them to your requirements.
Jack in the UK
BTW, a lot of this is a reflection of how people work, so the input is good. I tend only to use the filter approach, as I've outlined in the posts above. In my opinion, I don't like having range data twice, or a selected range of data, on the same sheet; I feel it gets confusing for some. Well, that's my opinion.
In addition to eliminating the duplicates you may at some time wish to tally the number of duplicates, like how many times does NY or CA or 3 or 1 show up.
=COUNTIF(A:A,INDEX(A1,1,1)) in B1 (equivalently, =COUNTIF(A:A,A1)) produces:
A1 B1
1 3
2 2
2 2
1 3
1 3
Just a quick way to total up the duplicates.
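The same tally can be sketched in Python with a Counter, for anyone following along outside Excel (an illustration, not part of the original thread):

```python
from collections import Counter

def occurrence_totals(values):
    """For each cell, the total number of times its value occurs
    anywhere in the column, like COUNTIF(A:A, A1) filled down."""
    counts = Counter(values)
    return [counts[v] for v in values]

print(occurrence_totals([1, 2, 2, 1, 1]))  # → [3, 2, 2, 3, 3]
```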
mscallisto
Even though the OP was looking for something a little bit different, a slightly better approach to your formula would be the one below.
This way you do not have as many numbers repeated.
=IF(COUNTIF($A$1:A1,A1)=2,COUNTIF($A$1:$A$100,A1),"")
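For illustration, the same "report the total only beside the second occurrence" idea in Python (a sketch under the same assumptions, not part of the original thread):

```python
from collections import Counter

def count_on_second(values):
    """Show each value's total count only next to its second
    occurrence, mirroring the COUNTIF(...)=2 variant."""
    totals = Counter(values)   # full-column counts
    running = Counter()        # running counts as we walk down
    result = []
    for v in values:
        running[v] += 1
        result.append(totals[v] if running[v] == 2 else "")
    return result

print(count_on_second([1, 2, 2, 1, 1]))  # → ['', '', 2, 3, '']
```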
thank you all for your support.
I would like to explain what I am trying to do with this spreadsheet.
Every month, I have a log that calculates the percentage of revenue to expenses (gross margin).
Sometimes the GM will be calculated not the next month, but maybe two or three months down the line. So the same sales order # may appear in one month as "0.00%" but in another month at either the same percentage (which means a calculation hasn't been done for it) or with a real percentage next to it.
On this spreadsheet, I copy and paste the new log on top of the prior month's information. So it is possible that in Jan, for example, I will have something that occurred in July sitting in the Dec portion because it had "0.00%" next to it.
So what I have to do is go back and delete the prior months' duplicates so I can keep an accurate year-to-date running total.
Does this explanation help clarify what exactly I need? I would like to put a formula next to the current month's entries to find out whether each one appears in a prior month, so I can delete it.
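That month-over-month check can be sketched outside Excel as well: flag any sales-order number in the current month's log that already appeared in a prior month, so the older duplicate can be deleted. The order-number format below is an illustrative assumption:

```python
def flag_repeats(prior_months, current_month):
    """Return (order, flag) pairs for the current month's log:
    flag is True if the order number was already logged in a
    prior month and is therefore a duplicate to delete."""
    seen = set(prior_months)   # every order number logged before
    return [(order, order in seen) for order in current_month]

prior = ["SO-1001", "SO-1002", "SO-1003"]    # hypothetical order numbers
current = ["SO-1003", "SO-1004"]
print(flag_repeats(prior, current))
# → [('SO-1003', True), ('SO-1004', False)]
```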
Thank you all for your support.
If you need to see the file, I attached it yesterday.
Yes, PJ KARAFFA, I do realize that this deviates from the original question; as stated, it was just a bit of extra info.
Your formula is also very creative and very useful; I'll add it to my goodie box.
Mine was meant to be a quick way to visit any row (of many unsorted rows) and quickly know a count of occurrences.