
Find And Remove Duplicate Files With fdupes

Today’s article has you cleaning up your storage space as you learn to find and remove duplicate files with fdupes. This isn’t something you need to do often, and it’s something that could theoretically break your system, so exercise some caution if you’re going to remove duplicate files.

I’ve previously shared another way to remove duplicate files:

Find And Remove Duplicate Files With rdfind

I suppose it’s pretty obvious why one might want to remove duplicate files. You do so to keep your storage tidy and to free up space when space is limited. There are all sorts of ways to make free space and removing duplicate files is just one of them.

The tool we’ll be using this time around is known as ‘fdupes’ and the man page describes it like this:

fdupes – finds duplicate files in a given set of directories

It’s an easy enough application to install and this article shouldn’t be all that long. There’s a very good chance that fdupes is already in your default repositories, which means it’s easy enough to install.

Installing fdupes:

You can install fdupes with a GUI and your software manager, but you can just as easily install it via the terminal. We’ll cover the latter, as it’s the most universal (and, I find, quickest) method. Of course, you’ll need an open terminal. In most cases you can just press CTRL + ALT + T and your default terminal should open.

With your terminal now open, let’s go ahead and install fdupes:

Debian/Ubuntu/derivatives:

sudo apt install fdupes

SUSE/OpenSUSE/derivatives:

sudo zypper install fdupes

RHEL/Fedora/Rocky/derivatives:

sudo dnf install fdupes

Arch/Manjaro/derivatives:

sudo pacman -S fdupes

Gentoo/Calculate/derivatives:

sudo emerge -a sys-apps/fdupes

And More! (Just search your default repositories for ‘fdupes’ and it’ll almost certainly be there.)

As you can see, you’ll find that fdupes is available for pretty much every Linux system out there. Not only is it available, it’s already packaged for you and easy enough to install via the terminal. On top of that, it doesn’t take much space to install fdupes, a mere 110 kB or so.
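If you want to confirm that the installation worked, you can ask fdupes for its version. The version number you see will depend on your distro, of course, but you should get a line of output rather than a “command not found” error:

fdupes --version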

NOTE: I do not have anything against GUI tools. While I have terminals open at all times, I do the majority of my computer interaction in a browser – specifically a GUI browser. I suggest and write about the terminal because it’s more universal. It’s also often faster, assuming you can type or at least cut and paste, than mucking about with a GUI software installer.

Anyhow, it’s nice and easy to remove duplicate files with fdupes. This article is going to show you how – and the article shouldn’t even be that long! It’s pretty simple.

Remove Duplicate Files With fdupes:

I’m going to assume that you left your terminal open after installing fdupes. If you didn’t, you’ll need to open it again. fdupes is a CLI tool, so even if you installed it with a GUI, you’ll need the terminal to run it.

The basic syntax is pretty easy, and not entirely unlike rdfind. For example, if you want to find duplicate files, you simply run this command:

fdupes <dir>

So, if you wanted to find duplicates in your home directory, you’d run this:

fdupes ~/

Don’t worry, you’re safe running that command. That won’t delete anything at all. That fdupes command will simply show you the duplicate files that it found.
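The output is just a list of the duplicate sets it found, with each matching file on its own line and a blank line between sets. With some made-up file names, it looks something like this:

fdupes ~/
/home/user/wallpaper.jpg
/home/user/wallpaper-copy.jpg

/home/user/notes.txt
/home/user/notes-backup.txt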

If you want to run the fdupes command recursively, that is, to also check all the folders within the directory, you’d run the command like this:

fdupes -r <dir>
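For example, to hunt for duplicates everywhere under your Pictures folder (a made-up target, swap in whatever directory you actually want to check), you’d run:

fdupes -r ~/Pictures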

If you want to see the size of the duplicate files it finds, the command is just this:

fdupes -S <dir>
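With the -S flag, each set is preceded by the size of the files in it. Again with made-up names and numbers, the output looks roughly like this:

fdupes -S ~/
3145728 bytes each:
/home/user/wallpaper.jpg
/home/user/wallpaper-copy.jpg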

A summary of how many duplicates were found, and how much space they take up, is also available with this command:

fdupes -m <dir>
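Rather than listing every file, the summary is a single line. The numbers below are made up, but the output looks something along these lines:

fdupes -m ~/
12 duplicate files (in 5 sets), occupying 8.3 megabytes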

You can also search multiple directories for duplicates. That’d be something like this:

fdupes <dir1> <dir2> <dir3>
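As a made-up example, this checks your Downloads and Documents folders against each other in a single pass:

fdupes ~/Downloads ~/Documents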

Of course, you can combine those flags to get quite a bit of customization. They’re all reasonably harmless and will simply point out the duplicates as well as some meta information. You can then remove the duplicates by hand if you want.
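For example (again, the directories are just placeholders), this recurses through two folders and finishes with a summary of what it found:

fdupes -r -m ~/Downloads ~/Documents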

You can also tell fdupes to remove the duplicates that it finds. You’d never want to run a deletion without knowing exactly what is going to be removed, so always check that you’re not removing anything of value before automatically removing duplicates.

Fortunately, there’s a bit of a safeguard. You can run the following command and fdupes will ask for confirmation before removing any files:

fdupes -d <dir>
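For each set of duplicates, fdupes numbers the files and asks which ones you’d like to preserve; everything you don’t preserve gets deleted. The file names and counts below are made up, but the prompt looks roughly like this:

[1] /home/user/notes.txt
[2] /home/user/notes-backup.txt

Set 1 of 5, preserve files [1 - 2, all]: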

If you want to go whole-hog and remove every duplicate found without any confirmation, keeping only the first file in each set, you can run this command:

fdupes -dN <dir>
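The short flags can be combined, so if you really must do an unattended, recursive cleanup, something like the following works. The directory here is only an example, and you should double-check the target before running it, because everything but the first file in each set will be deleted:

fdupes -rdN ~/Downloads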

Of course, that’s just the basics. If you want to know more about fdupes, simply check the man page (man fdupes) for more information.

Closure:

Hmm… I think I need to clean or replace this keyboard. The colon key is sticking on me. It’s a bit of a pain in the butt.

Anyhow, if you’ve ever wanted to remove duplicate files with fdupes, you now have directions to do so. This being Linux, you have all sorts of options when it comes to removing duplicate files, though I again urge caution when doing so. If you were to run this on the root directory, you’d likely find a lot of duplicates, and removing them might break your system. So, be careful with these tools, as they’re pretty powerful.

I’m not yet out of ideas for articles but it’d be great if folks might suggest something they’d like to read about. You never know, it might be something I know about. As it is, I have to search this site before writing an article, or else I’d end up with even more duplicate articles. I don’t want that and you don’t want that. It’s not all that easy to keep up this pace, writing a new article every other day. I’ve managed so far, but I’m eventually going to miss a day or two. It’s going to happen.

As always…

Thanks for reading! If you want to help, or if the site has helped you, you can donate, register to help, write an article, or buy inexpensive hosting to start your site. If you scroll down, you can sign up for the newsletter, vote for the article, and comment.

