4chan Downloader for Ubuntu

You all know it, love it or hate it.

Here is a script that automatically downloads all the pictures of a given board. It goes through all the threads and saves the pictures in the same folder the script file is in.

My particular script checks for duplicate files (by name only) and skips them, so you don't download the same lame picture a hundred times.
It lets you choose the board you want to download from and tells you which page and which thread it is currently downloading. It also tells you how big the image it just downloaded is and how long the download took.
At the end of every thread, it shows you how many pictures it has downloaded (the skipped ones don't count) and how many kB you have already wasted.

Here is how to do it:

Create a new folder to download the pictures into. Keep in mind that if you leave the script running for an hour, you can end up with 500 MB worth of pictures. I created a folder for every board I want to download pictures from.

Open a terminal window and navigate to that folder.
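For example, the first two steps might look like this (the folder names here are just an assumption, use whatever you like):

```shell
# Create one folder per board you want to archive
mkdir -p 4chan-pics/wg
# Move into it so the script saves its pictures there
cd 4chan-pics/wg
```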


gedit 4chandownloader

A new window will open; paste the code below into it.


#!/bin/bash

echo -e "Choose your board: \nb for random,\nr for request,\nwg for wallpapers,\np for photography and so on"
read BOARD
USERAGENT="Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.8) Gecko/20100214 Ubuntu/9.10 (karmic) Firefox/3.5.8"

fetch() { wget -U "${USERAGENT}" -e robots=off -q "${@}"; }

echo "Board: $BOARD"
sum=0			# running total of kB fetched
picCount=0		# running total of pictures fetched
for ((i=0; i<=10; i++)); do			# loop over the board's page numbers
	echo "Page ${i}"
	for thread in $(fetch -O- "http://cgi.4chan.org/${BOARD}/${i}.html" | sed -re '/value=delete.*filetitle/ { s/[^0-9]//g; p }; d'); do	# extract thread numbers from the page
		echo -e "Page ${i} \nThread ${thread}"
		echo -e "Already fetched $picCount pictures and $sum kB"
		for image in $(fetch -O- "http://cgi.4chan.org/${BOARD}/res/${thread}.html" | sed -re 's/.*<a href="(http:..images.4chan.org.[^"]+)".*/\1/p; d'); do	# extract image URLs from the thread
			if [ -f "${image##*/}" ]; then			# ${image##*/} is just filename+extension; skip if it already exists
				echo -e "\t ${image##*/} exists, skipping"
			else
				startTime=$(date +%s)			# start time, to measure the download
				echo -e "\t ${image##*/}"		# print the picture's filename
				fetch "${image}"
				endTime=$(date +%s)
				diffTime=$(( endTime - startTime ))			# seconds the download took
				FILESIZE=$(( $(stat -c%s "${image##*/}") / 1024 ))	# file size on disk, in kB
				echo -e "\t Fetched $FILESIZE kB in $diffTime seconds"
				sum=$(( sum + FILESIZE ))			# running total of fetched data
				picCount=$(( picCount + 1 ))			# running total of fetched pictures
			fi
		done
	done
done
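The duplicate check relies on bash's `${image##*/}` parameter expansion, which strips the longest match of `*/` from the front of the URL and leaves only the filename, and the kB figure is plain integer arithmetic on the byte count from `stat`. A standalone sketch of both (the URL and byte count are made-up example values):

```shell
# ${image##*/} removes everything up to and including the last "/"
image="http://images.4chan.org/wg/src/1234567890.jpg"
filename="${image##*/}"
echo "$filename"    # prints 1234567890.jpg

# Integer division turns a byte count into whole kB
bytes=204800
kb=$(( bytes / 1024 ))
echo "$kb"          # prints 200
```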

Save the file and close the gedit window.

Now, type

chmod +x 4chandownloader

You are ready to download all the pictures.
The command to run the script is:

./4chandownloader

To stop, press CTRL + C.

I will post the same script, but with a modification so that it directly shows the pictures you have just downloaded, like a slideshow.


One Response to 4chan Downloader for Ubuntu

  1. hajimson says:

    it doesn’t work anymore
    all I get is in output:
    Page 0
    Page 1
    Page 2
    Page 3
    Page 4
    Page 5
    Page 6
    Page 7
    Page 8
    Page 9
    Page 10

    I guess there is something wrong with the URLs.
    I tried changing ‘cgi.4chan.org’ to ‘boards.4chan.org’, but without any result. I don’t know Linux scripting very well, so I can’t fix it myself.
