/hydrus/ - Hydrus Network

Archive for bug reports, feature requests, and other discussion for the hydrus network.


(8.89 KB 480x360 zlp2VEUTTeI.jpg)

Version 329 hydrus_dev 11/07/2018 (Wed) 23:44:06 Id: e517db No. 10573
https://www.youtube.com/watch?v=zlp2VEUTTeI

windows
zip: https://github.com/hydrusnetwork/hydrus/releases/download/v329/Hydrus.Network.329.-.Windows.-.Extract.only.zip
exe: https://github.com/hydrusnetwork/hydrus/releases/download/v329/Hydrus.Network.329.-.Windows.-.Installer.exe
os x
app: https://github.com/hydrusnetwork/hydrus/releases/download/v329/Hydrus.Network.329.-.OS.X.-.App.dmg
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v329/Hydrus.Network.329.-.OS.X.-.Extract.only.tar.gz
linux
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v329/Hydrus.Network.329.-.Linux.-.Executable.tar.gz
source
tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v329.tar.gz

I had a great week. The login manager is now ready for everyone, and I did some neat other stuff as well.

login manager

I have fixed up how the login manager deals with various errors and am happy for everyone to give it a go. This release also introduces login scripts for Deviant Art, Danbooru, and the Gelbooru sites.

Please check it all out under network->downloaders->READY FOR CAUTIOUS USE: manage logins. This ui needs a bit more work to make it user-friendly, but you basically enter some credentials (username & password) for a domain and then 'activate' the login script for it. Thereafter, the client will try to log in to that site whenever it needs something from it and doesn't think it is logged in. It will make a little popup window while it logs in just to let you know how it is doing. If it fails, it will 'block' that domain: any network jobs will hang on it and any subs will pause on it. Most logins last from 30 days to five years, so after the initial success, you likely won't see it ever again, particularly if you use subscriptions.

It seems to work ok! That said, feedback would be appreciated, particularly for the individual scripts. If you discover the client does not notice when it fails to log in, fails to notice when it has naturally been logged out, or keeps reattempting a subscription that has an invalid login or something, please let me know.

Caveats:

As it says on the dialog: with hydrus, use throwaway accounts or accounts you otherwise do not care much about. The credentials are stored in plaintext, and if something goes wrong with the login system or you binge one night and download too much, you don't want to get a black mark on an account you care about.

The Deviant Art login page only seems to work if your client has some history of downloading from Deviant Art previously. Otherwise, it puts up an 'Are you a robot?' captcha challenge, which this first login manager cannot deal with. If you have been downloading from Deviant Art for a while, you should be fine (otherwise, try downloading a few SFW files first and then try activating the login). The DA parser is also updated to correctly download NSFW images (which you will have access to once logged in).

I don't know when this happened, but it seems that Pixiv now also allows guests to access SFW images. This is a neat development, but it means my login script as-is doesn't detect when it fails to log in (since pixiv now gives you a 'guest' login regardless). If you have had trouble getting NSFW pixiv images and can't figure it out, please check your credentials are all correct and try deleting your cookies on the login dialog to try again. I will revisit the script next week to see if I can better detect a failed login.
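For a rough picture of what an activated login script automates, here is a minimal, hypothetical Python sketch: POST credentials once, keep the session cookies, and only log in again when they look expired. This is not hydrus's actual login engine; the URL and form field names below are placeholders.

```python
import time
import requests

session = requests.Session()

def cookies_for_domain(domain):
    # Collect this session's cookies for the given domain.
    return [c for c in session.cookies if domain in (c.domain or '')]

def is_logged_in(domain):
    # Treat the login as still valid while we hold at least one unexpired cookie.
    now = time.time()
    cookies = cookies_for_domain(domain)
    return bool(cookies) and all(c.expires is None or c.expires > now for c in cookies)

def ensure_logged_in(domain, login_url, username, password):
    # Only hit the login form when we do not already look logged in.
    if is_logged_in(domain):
        return
    response = session.post(
        login_url,
        data={'username': username, 'password': password},  # placeholder field names
        timeout=30,
    )
    response.raise_for_status()  # a real script would also inspect the page/cookies for success

# ensure_logged_in('example.com', 'https://example.com/login', 'throwaway_user', 'hunter2')
```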
I looked at FurAffinity this week, since it has been highly requested, but found they require a captcha even on regular login attempts (and not, say, just sign-ups). This throws a wrench in our plans, as this first version cannot deal with captchas. Unless there is a simpler alternate login system, there is likely no good solution here for now, and FA users will have to copy cookies across from their browser (which I still intend to make more user-friendly and reliable in the coming weeks).

I had hoped to roll out a Hentai Foundry login this week, but it turns out their login cookie is dynamically named, so I will have to tweak the login manager to deal with it. Not a huge issue, but it will be delayed a little. The click-through guest login is now working entirely on the new system and appears to be fine.

other stuff

Texts in the program that ellipsize (Like this…) when they do not have enough room now present tooltips with the full text. This includes the status on the network job control, which will help you review any long login errors.

The manage tags dialog's buttons get a bit more tweaking this week: both the remove and copy buttons now work differently depending on whether any tags above are selected: if tags are selected, only those will be affected; if none are selected, all will be affected. Also, the remove button is now wrapped in a yes/no dialog that can be turned off in the cog menu.

The filename tagging options panel now allows you to additionally tag files by their last, second last, and third last directory names! (There is a small sketch of this idea at the end of this post.)

The export files panel now allows you to delete the files from the client after export. This checkbox value is remembered and presents a WARNING yes/no dialog on export just to make sure.

Manage tag siblings and parents now handle large 'add' batches as one transaction, whether that is from a big .txt/clipboard import or from adding multiple parents. Everything is now bundled into one big list and you are only asked for the pend 'reason' once.

full list

- login:
- the login manager is fully turned on! hentai-foundry click-through and pixiv login now occur fully on the new system
- wrote a Deviant Art login script for NSFW downloading–however, it only seems to work on a client that has done some logged-out downloading first (otherwise it thinks you are a robot)
- updated the DA file page parser to only NSFW-veto if the user is currently logged out
- wrote a danbooru login script for user prefs and special files if you have a gold account
- wrote a gelbooru 0.2.x login script for user prefs
- pixiv recently(?) allowed non-logged-in users to see sfw content, so the login script is updated to reflect this. the login script doesn't detect a failed login any more, so I will revisit this
- logging in in the regular order of things now makes a temporary popup message with the overall login status and final result. it is cancellable–and if cancelled, future login attempts will be delayed
- logging in in the regular order of things now prints simple started/result lines to the log
- deleted old network->login menu and related code such as the custom pixiv login management. gdpr click-through is now under downloaders
- subscription login errors will now specify the given login failure reason
- subscription login tests will now occur at a better time, guaranteeing the sub will be correctly saved paused if the test fails
- login errors will now always specify the domain for which they failed
- testing a login script on a fresh edit login script dialog now pre-fills the alphabetically first example domain
- the login script test ui now restores its 'run test' button correctly if the test is abandoned early
- misc improvements to login error handling and reporting
- .
- other:
- any texts across the program that ellipsize when they are too thin to display what they have will now tooltip their text (this most importantly includes the status on the network job control, which will now display full login problem info)
- the copy button on manage tags goes back to copying all if no tags are selected
- the remove button on manage tags now removes only selected if some tags are selected. it still removes all if none are selected
- the remove button on manage tags is now wrapped in a yes/no dialog (as is hitting the delete key on the list's selection). this can be turned off under the cog button
- filename tagging panels now support directory tagging for the last, second last, and third last directories. the related code for handling directory tagging is cleaned up significantly
- the export files panel now lets you delete the files from the client after export. this value will be remembered, and if on will prompt a capital letters warning on export, either via the button or the quick-export shortcut
- in manage tag parents, where there are multiple parents in a pending action (either by importing via clipboard/file or by putting multiple parents in the right-hand box), the action will now be treated as one transaction with one 'enter a reason' confirmation!
- in manage tag siblings, when multiple 'better' values are pended in one action via a clipboard/file import, they will now be treated as one transaction with one 'enter a reason' confirmation!
- .
- misc:
- added a new url class that api-links .gifv-style imgur links so they are downloadable like regular imgur single media pages
- the pixiv manga page url class now redirects to the new api, so mode=manga pages should now be drag-and-drop importable and generally downloadable if you have any still hanging around in any queues
- clients now come with an additional danbooru parser that fetches the webm version of ugoiras
- after discovering a pdf that ate indefinite 100% CPU while trying to parse, I have decided to stop pulling num_words for pdfs. it was always a super inaccurate number, so let's wait for a better solution at a later date. hydrus hence no longer requires pypdf2
- fixed an issue with monthly bandwidth estimates rolling over to the new year incorrectly
- in an attempt to chase down a duplicate files content move/copy bug, the duplicate action content updates got a bit of cleanup work. if you have noticed duplicate actions not copying tags/urls, please let me know the exact process in the ui, including services and merge options, you went through
- tag lists should now update their sibling appearance correctly after a tag siblings dialog ok–previously, they were checking for new sibs too early
- tag siblings and parents should now refresh their data more efficiently when spammed with new data notifications (this usually happens janitor-side, when approving dozens at once)
- copy queries/watcher urls on the download pages' lists' right-click menus no longer double-spaces the copied texts (it just does single spaces)
- fixed an issue where certain initialised watchers were erroring out when asked to provide next-check time estimates–in all cases, null timestamps will be dealt with better here
- misc tag parents/siblings ui code cleanup
- wrote some code to catch and report on an unusual dialog dismissal error

next week

I'll do a bit more login manager work, figuring out easy solutions for the difficult cases where I can and rolling out more scripts, and otherwise catch up with small work. I have five weeks left to make a nice 'final' python2 release before I take my holiday break to go up to python3.
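To picture the new directory tagging option mentioned above, here is a toy Python sketch that pulls the last three directory names from a file's path as tags. It is a simplification for illustration only, not hydrus's actual implementation.

```python
import os

def directory_tags(path, levels=3):
    # Split the file's directory path and keep the last few directory names as tags.
    directories = os.path.normpath(os.path.dirname(path)).split(os.sep)
    directories = [d for d in directories if d not in ('', '.')]
    return directories[-levels:]

print(directory_tags('/downloads/artist name/series name/chapter 1/page_001.png'))
# -> ['artist name', 'series name', 'chapter 1']
```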
First for plugin system/API please. Thanks for this release! PS: prkc is iq89 :^)
(23.55 KB 383x196 Untitled.jpg)

There's still an issue with the pixiv popup disappearing when the subscription sync finishes. It takes the "show X files" button with it, and it resets for each artist/query instead of showing a final "pixiv" "show XX files" button.
(11.83 KB 320x169 34635457.jpg)

The Deviant Art downloader seems to work so far. One small annoyance with it though, mostly with how DA works: I noticed that it uses the Download button when it's available, which isn't always useful, as uploaders sometimes compress their images into zip/rar archives, so Hydrus picks those up and you end up with 1 image sitting inside a zip file. Another thing is artists tend to upload things other than images to the Download button in a compressed file that hydrus also picks up. Modders, for example: https://www.deviantart.com/segadordelinks/art/SFV-Mod-Chun-Li-June-Edit-770750958 Here you still have an image, but the Download button is for the mod, so hydrus picks up the mod instead of the image.
I have a question about login. Is it possible for the program to check, before it starts up, whether it's logged in? Either by:
1) going to a default page and looking for the sign-in page
2) going to a default page and looking for a log-out page
3) going to a default page and looking for the absence of either of the above
4) going to a specific page you can't access without being logged in

The main page for derpi has a few examples. https://derpibooru.org/users/sign_in will tell you you aren't signed in if it's present, and its absence will tell you that you are signed in. https://derpibooru.org/settings is a page that radically changes if you are logged in, because some things are handled internally and some are handled through just a cookie; not the best way to do it for this site, as many others require a login or give you a full error for this. The presence of https://derpibooru.org/users/sign_out will tell you that you are logged in, and its absence can tell you that you are logged out. It would also be possible to use a test image on a site like pixiv that is visible when logged in but not when logged out.

Now as to the FurAffinity problem, I have a solution of sorts; it may not be the best one, but it would likely be the easiest one for the end user. Would it be possible to make a browser script to capture the login credentials, and an import into hydrus of the file it saves? Something where the browser just saves it to disk, plus an input-credential menu for hydrus. Just a few thoughts so far, no idea how implementable the last one is.
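The 'look for a marker on a default page' idea above could be sketched like this in Python. The Derpibooru URLs are the ones mentioned in the post; the heuristic itself is just an illustration of the idea, not hydrus code.

```python
import requests

def looks_logged_in(session, base_url='https://derpibooru.org'):
    # Fetch a default page and infer login state from the markers the post mentions.
    page = session.get(base_url, timeout=30).text
    if '/users/sign_out' in page:
        return True   # a sign-out link only shows for logged-in users
    if '/users/sign_in' in page:
        return False  # a sign-in link implies a guest session
    return None       # this page alone is not conclusive

# session = requests.Session()
# print(looks_logged_in(session))
```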
In "manage logins" I wrote my login on the wrong domain and now I can't clear it. There should be a way to do that.
The browsing experience would be much improved if switch_between_100_percent_and_canvas_zoom were persistent until you close the full-screen viewer. I was going through a search for very low resolution images to delete, and I wanted them to remain unzoomed so I could make a judgement, while for normal browsing I want them to fit my screen. As it is now, I have to toggle each time I go to the next image.
Wow, forgot to look in on the last thread for a while. >>10558 >>10514 On the thumbnails, I upgraded versions and I don't see the issue anymore. It could have needed a restart of the program, or something changed in the new version that fixed it for me; if it happens again, I'll screenshot the whole program and post it. As for the client lag, sure, I'll run profiles in a day or so when I put in a massive load of images. Even doing nothing, right now my client is 'cpu busy'; a good chunk of that is likely thread watchers, along with me getting a bit lazy about dealing with incoming files. A few things happened that necessitated a 50k image import that I haven't given even a cursory look yet, so that's gumming it up a bit just because of the image load.
>>10576 I expect to put the poll up this Wed, btw.
>>10580 Thank you for this report. Can you please go in to edit this sub and see what the 'presentation' options are down the bottom? Does it say there to 'publish new files to a popup button', or has this been unchecked (maybe by accident)? If it is checked, what do the 'presentation options' look like under 'file import options'–which kinds of files are set to present?
>>10581 Thank you for this. Another user mentioned this to me as well. I will make sure to update the parser to use the better link when available. Please let me know how it works for you.
>>10586 Thanks, this is by trying to edit it and clear the text boxes, right? For now, you should be able to delete/add to force refresh/clear it.
>>10584 Yeah, I think this kind of thing may be needed for some types of sites that provide bad cookies. "If this state is valid on this page, you can assume you are still logged in for 60 mins" kind of thing. I still have to play with Hentai Foundry to see if the dynamic cookie is enough. I don't think I have time to do anything like this well in this iteration, however. Can you describe your idea to get around FA a bit more? Atm, the problem on my end is that along with username and password field in their login form, they also need some recaptcha fields. In hydrus, this would mean fetching a recaptcha key and presenting the associated challenge/puzzle to the user in hydrus ui so they could solve it and so I could POST the key and answer back along with username/password. Do you have in mind some way of connecting the browser to hydrus to help out?
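For illustration, the captcha-aware POST described above might eventually look something like this sketch: the user solves the challenge somewhere in the ui, and the resulting token is submitted alongside the credentials. The form field names are guesses rather than FurAffinity's real ones; only 'g-recaptcha-response' follows the usual reCAPTCHA convention.

```python
import requests

def login_with_captcha(login_url, username, password, captcha_token):
    # The captcha token would come from the user solving the challenge in some ui.
    session = requests.Session()
    response = session.post(
        login_url,
        data={
            'name': username,                       # illustrative field name, not FA's real one
            'pass': password,                       # illustrative field name, not FA's real one
            'g-recaptcha-response': captcha_token,  # the usual reCAPTCHA answer field
        },
        timeout=30,
    )
    response.raise_for_status()
    return session  # the cookies kept here are what later requests would reuse
```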
>>10587 I agree. My idea on this is to add it as a cog icon option somewhere in the media viewer ui. I'll bump the job up.
>>10619 yeah, trying to clear the text box
(111.20 KB 588x1933 error.png)

Hmm, it starts up without problems the first time, I close it, open it again, and then it gives me these errors. Wtf? (I recently updated 11 versions, but it might not be this, as I had this problem before. What is/could be going on?) Welp, I just deleted everything and am just gonna start over and update regularly this time, but I would like to know what it is regardless. (And it might not even have been my updating, as I had errors before this that were very similar too.) Thanks at any rate, you are doing an awesome job.
>>10626 The 'db is locked' issue seems to occur in some combination of 'a client is already running' and 'the last client shutdown was bad'. I have had a couple of reports about this but haven't been able to reproduce it or otherwise figure it out. It _may_ be related to a hard drive hiccup, but I don't have any firm evidence. The solution seems to be checking for and force-killing any existing client.exe in task manager and otherwise just trying to boot a couple of times. If your client booted ok afterwards, I would recommend you stick with it, as this issue so far seems otherwise harmless. If you still have your old install, can you check your client.log in install_dir/db? The errors should have been written to that log. If they aren't just reiterations of 'db was locked', I'd be interested to see them.
>>10628 Woah, thanks for the answer. I simply took it back out of the trash, but I think I have to re-organize anyways. It's a welcome excuse, haha. I hope you can trust anonfile, I uploaded the raw txt files: https://anonfile.com/meTaV1kebe/client_-_2018-11_log https://anonfile.com/n5TbV9k1b0/client_-_2018-6_log https://anonfile.com/ocT7V4keb6/client_-_2018-7_log https://anonfile.com/pdTbV4kbb6/client_-_2018-9_log The first link takes you to the log of today when the error occurred; the others are just there if you can be bothered to look into them, but in the end it's not really important, I think. At any rate, thank you for replying. I almost feel bad for taking up your time to respond, since the error kind of "resolved itself" (it still works) afterwards. Have a nice day. I salute your effort for this program.
(26.75 KB 810x572 Unbenannt.png)

>>10630 >>10628 Sorry that I ask another thing yet again, but I am a little illiterate concerning technology: What am I doing wrong that I have three (3) different "Public Tag Repositories"? Image related.
>>10630 Thanks, I have these logs now. You can delete the links if you wish. Please don't worry about reporting errors–my going through reports and figuring out better solutions is the main way the program gets better, whether that is figuring out a technical fix or simply rewording an error message. I appreciate your feedback. >>10631 I am not sure what happened here–did you happen to go through the help->i don't know what I am doing menu more than once? This could have created multiple entries here. MAKE A BACKUP OF YOUR DB FILES BEFORE YOU DO THE FOLLOWING: I suggest you look at these three services and make a note of which seems the most 'healthy' and then go services->manage services and delete the two bad ones. If you have multiples of my file repository, you can delete two or all three of those as well. Let me know if you get any errors trying to delete the extra services.
>>10617 Not sure how it got unchecked but that did the trick. Thanks Dev
>>10620 No real way to connect the browser to hydrus, at least that I know of; my idea was logging in on chrome and having it save the login info to something that could be opened in hydrus. I have no idea how feasible my idea is, or if it's more work than what you are thinking it will be.
>>10573 Hi admin, can you reply to my message on your email? I actually sent you an email with all my questions. Also sorry if I disturb you <3
>>10573 I have a question. As you may remember, I have an issue with one of my subs, mainly the one to derpi, first_seen_at.gt:3 days ago, upvotes.gte:150, not going through all of the pages due to the weird way that it displays. You mentioned a potential solution a few threads back; did that solution get put in and I'm an idiot and didn't see it, or was it never implemented? Just a refresher: the site sorts by time uploaded, and the problem is that something from 2 days and 12 hours ago may just pass the 150-upvote threshold and be incorporated, and it's now on page 14, but the sub stops trying at pages 1-5 because it sees too much it already knows. The solution I thought of would be an 'ignore it' toggle, possibly hidden by default, as this is really likely the only search that will run into problems where constantly resetting the sub is needed. You thought of one where a custom threshold could be set according to needs, where I could set it to whatever is needed to make sure it hits everything.
>>10635 Hey, should be sent. I am afraid I am a bit slow keeping up with email, but I always try to keep it within a week. Same with threads tbh, ha ha. >>10637 I am not totally sure. I fiddled with the way the 'you hit the periodic limit' warnings work recently, so syncs now push on if the last five files in a page were not all already-seen, and the popup warnings now only appear if all the files in a single sync were new. I am not sure how this would exactly play out for your situation, so if you have had it paused for a few weeks, I recommend you just turn it on and see what happens. If it is still not checking deeper because your search commonly has the last five files of an early page all already-seen, let me know, and I'll see about adding an advanced option to modify the five-files check. I'd have to be careful about it, though, and certainly hide it behind advanced mode and maybe some CAREFUL M8 dialog. I don't want to give users the power to set up a sub that checks a hundred pages every five minutes by accident. Forgive me if I misunderstand what first_seen_at means, but could you cheese your way through this by setting a specific check time of exactly three days? Would that guarantee that every file was new? Or if you set it at a fixed 2 or 2.5 days, would that almost always guarantee the search would go deeper due to the new file quantity, but still guarantee some already-seens so you don't get the periodic limit popup?
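The 'push on if the last five files were not all already-seen' rule above can be sketched as a tiny decision function. This is just an illustration of the described behaviour, not hydrus's actual sync code.

```python
def should_fetch_next_page(file_urls_on_page, already_seen, window=5):
    # Keep paging while the last few files found are not all already known.
    tail = file_urls_on_page[-window:]
    return not all(url in already_seen for url in tail)

print(should_fetch_next_page(['a', 'b', 'c', 'd', 'e', 'f'], {'e', 'f'}))  # True: only 2 of the last 5 seen
```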
>>10640 The 'first seen 3 days ago' is a holdover from their own filter that sorts by number of upvotes, activity > rankings > top scoring, which filters everything. But with you saying that, I could make a very exacting list for much of the range; I didn't consider a less-than/greater-than combo. I'll try this out and see how it goes for a few days, having it go in tandem with the current filter and resets… granted I will need to keep track of the numbers outside the program, but that could work.
>>10640 >>10642 Ok, going to make a sub that is specifically this:
first_seen_at.gt:3 days ago,upvotes.gte:150,upvotes.lte:153
first_seen_at.gt:3 days ago,upvotes.gte:154,upvotes.lte:156
first_seen_at.gt:3 days ago,upvotes.gte:157,upvotes.lte:159
first_seen_at.gt:3 days ago,upvotes.gte:160,upvotes.lte:162
first_seen_at.gt:3 days ago,upvotes.gte:163,upvotes.lte:165
first_seen_at.gt:3 days ago,upvotes.gte:166,upvotes.lte:169
first_seen_at.gt:3 days ago,upvotes.gte:170,upvotes.lte:172
first_seen_at.gt:3 days ago,upvotes.gte:173,upvotes.lte:175
first_seen_at.gt:3 days ago,upvotes.gte:176,upvotes.lte:180
first_seen_at.gt:3 days ago,upvotes.gte:181,upvotes.lte:185
first_seen_at.gt:3 days ago,upvotes.gte:186,upvotes.lte:190
first_seen_at.gt:3 days ago,upvotes.gte:191,upvotes.lte:195
first_seen_at.gt:3 days ago,upvotes.gte:196,upvotes.lte:200
first_seen_at.gt:3 days ago,upvotes.gte:201,upvotes.lte:205
first_seen_at.gt:3 days ago,upvotes.gte:206,upvotes.lte:215
first_seen_at.gt:3 days ago,upvotes.gte:216,upvotes.lte:225
first_seen_at.gt:3 days ago,upvotes.gte:226,upvotes.lte:235
first_seen_at.gt:3 days ago,upvotes.gte:236,upvotes.lte:245
first_seen_at.gt:3 days ago,upvotes.gte:246,upvotes.lte:265
first_seen_at.gt:3 days ago,upvotes.gte:266,upvotes.lte:285
first_seen_at.gt:3 days ago,upvotes.gte:286,upvotes.lte:305
first_seen_at.gt:3 days ago,upvotes.gte:306,upvotes.lte:345
first_seen_at.gt:3 days ago,upvotes.gte:346,upvotes.lte:385
first_seen_at.gt:3 days ago,upvotes.gte:386,upvotes.lte:465
first_seen_at.gt:3 days ago,upvotes.gte:466,upvotes.lte:545
first_seen_at.gt:3 days ago,upvotes.gte:546

This amount of consolidation should make it near impossible to miss a file, as every one is going to be sub 15, and even in the case a band goes over 15, like with one of the 200-range searches, checking every 6 hours should catch it at one of the stages. Also, whoever thought of the paste tags for subs is a fucking genius.
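For reference, the banding above can be generated programmatically. This little Python sketch reproduces the poster's bands and is purely illustrative; the query syntax is Derpibooru's filter language as written in the post.

```python
# Band edges taken from the post above; each band covers [low, next_low - 1],
# with an open-ended final band.
BAND_EDGES = [150, 154, 157, 160, 163, 166, 170, 173, 176, 181, 186, 191,
              196, 201, 206, 216, 226, 236, 246, 266, 286, 306, 346, 386,
              466, 546]

def banded_queries(edges, prefix='first_seen_at.gt:3 days ago'):
    queries = []
    for low, next_low in zip(edges, edges[1:]):
        queries.append(f'{prefix},upvotes.gte:{low},upvotes.lte:{next_low - 1}')
    queries.append(f'{prefix},upvotes.gte:{edges[-1]}')  # open-ended top band
    return queries

for query in banded_queries(BAND_EDGES):
    print(query)
```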
I just noticed this in the right-click menu: "show all watchers' files in new page". So, two things: 1) watchers' should probably be watcher's 2) can you add a 'show all watchers new files in new page' to the right-click menu? This would solve one of my bigger minor issues.
>>10644
> watchers' should probably be watcher's
watchers' here stands for the files of the watchers. In English this is correct.
> new files
What is a new file?
(59.95 KB 1128x698 client_2018-11-14_03-47-15.png)

Bit of an issue with how I set up the downloaders: I did not anticipate that one would come back with a DEAD status. Does this mean it won't check going forward and needs babysitting, or is this something that will be solved with a subsequent subscription check? If it's going to fail when a range sees no images, but I can't expand the range too much… at least until a derpi login is made, which will hopefully give 50 a page… because at 15 images a page it says it has caught up too often, I am at a bit of a loss as to how to proceed. The way this looks to me is it won't retry next check. I could be wrong, but is there a way to force it to never not look at a sub?
>>10642 Your solution in >>10643 looks maybe a bit cumbersome and autistic to me, but if you find it works and doesn't annoy the server you are talking to (I imagine generating those queries is a little CPU expensive on their end), then that could be a solution. I might tune those ranges to be a bit broader, with maybe only five 'bands' of upvotes. The DEAD result in >>10646 is an artifact of that particular band not happening to have files on its first check, which is related to how thin it is. I recommend you wait a couple of days and tell that single query to reset/check_now and see if you can initialise it with some files. As long as your check options for this sub are sufficiently broad, like 'only die if less than one new file in six months', I shouldn't think this kind of ratings-based feed would ever die (after the first good initialisation) as long as the site is active.

>>10644 Do you mean like 'the files that appeared since I looked last'? Unfortunately, watchers don't keep those sorts of records, so they only ever reproduce what their file import options have set in their 'presentation/publish' settings. For the apostrophe, I meant that as the plural of watchers, as >>10645 says, since it works for multi-selection. It doesn't fix itself if you only have one watcher selected. There are a bunch of places around the program where I lazily use the plural label like this even when currently only one thing is selected, like 'do action x to this 1 files', because the finicky text alteration every time for a singular selection is a pain in the ass. I eagerly await the AI takeover that will force all humans to standardise to One Perfect Language that has a single letter for pluralisation and a library I can import to handle it automatically.

