pictureswap dot org /success
refresh for new pic
I wrote this stupid thing in Java in like 5 mins and it's getting a shit ton of pics. It's been running since yesterday and I've downloaded more than 3GB.
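The actual source never got posted in the thread, so here's a rough stdlib-only sketch of what that kind of grabber could look like. Everything is an assumption: the class name, the markup of the success page, and jsoup is swapped for a plain regex so the snippet is self-contained (with jsoup you'd use `Jsoup.connect(...).get()` and `doc.select("a")` instead).

```java
import java.io.InputStream;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of the grabber; NOT the anon's actual code.
public class PicSwapGrab {

    static final String PAGE = "http://pictureswap.org/success";

    // Mirror of the bash pipeline further down the thread: keep hrefs from
    // anchors that mention "download" and drop anything mentioning "glyph".
    static List<String> extractLinks(String html) {
        List<String> out = new ArrayList<>();
        Matcher m = Pattern
                .compile("<a[^>]*href=\"([^\"]+)\"[^>]*>([^<]*)</a>")
                .matcher(html);
        while (m.find()) {
            String href = m.group(1);
            String text = m.group(2);
            boolean mentionsDownload = href.contains("download") || text.contains("download");
            boolean mentionsGlyph = href.contains("glyph") || text.contains("glyph");
            if (mentionsDownload && !mentionsGlyph) out.add(href);
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        while (true) { // "refresh for new pic"
            HttpRequest req = HttpRequest.newBuilder(URI.create(PAGE)).build();
            String html = client.send(req, HttpResponse.BodyHandlers.ofString()).body();
            for (String link : extractLinks(html)) {
                // Keep the original filename so the pic can still be referenced
                // by just posting the website link.
                String name = link.substring(link.lastIndexOf('/') + 1);
                HttpRequest dl = HttpRequest
                        .newBuilder(URI.create("http://pictureswap.org/" + link))
                        .build();
                try (InputStream in = client
                        .send(dl, HttpResponse.BodyHandlers.ofInputStream()).body()) {
                    Files.copy(in, Path.of(name)); // throws if the file already exists
                }
            }
            Thread.sleep(2000); // don't hammer the server
        }
    }
}
```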
It took me the whole day to sort good/bad pics.
I'm so fucking exhausted and my folder is full again.
btw uploading folder to mega right now
Just noticed some troll pics slipped into the folder, probably added while I was checking them. Is that still ok or do I have to double check before? Would take like 10 more minutes.
Yes, that's what it is. I want to keep the original filenames so I can post just the website link to people, but files are deleted from the server after 12 hours.
Download Eclipse and compile; you'll need the jsoup external jar file.
You're welcome anon
Also checking my dubs
This should work in bash if you have wget and curl installed (split into two lines so the quoting stays sane):
url=$(curl -s "http://pictureswap.org/success" | grep "download" | grep -v "glyph" | cut -d'"' -f2)
wget "http://pictureswap.org/$url"
Right click > New > Java Project
Right click on project > New > package, name it picswapgrab
Right click on package > New > Class
Empty everything and paste my code
Right click on project > Properties
Click on Java Build Path on left
Click add external jars
browse to jsoup.jar; you can find it here: https://jsoup.org/
Click apply, then Ok
Now save all (icon on top left of the program)
Then right click on project > Refresh
Now click File > Export ...
Double click "Java" folder, choose Runnable JAR file > Next
Launch Configuration: [Project Name] - [Class Name]
Export destination is where you want to save the file
Now close eclipse, browse to the folder where you exported the jar file
Create a batch file and enter this:
java -jar [Filename].jar
Where [Filename] is the file you just compiled
>Geez anon I don't know what you're thinking but I can't give you a reward
I've never been so bored
fgts.jp is down, so browsing the /b/ archives there is shit; forget about it, the other archives suck.
Yeah, the only other /b/ archive I know of is random archive, but it doesn't save webms and has a shit search. Apparently it's down because of the software they use to save the images, but other archives use it as well, which is weird.