/hydrus/ - Hydrus Network

Archive for bug reports, feature requests, and other discussion for the hydrus network.


(8.37 KB 480x360 8bw5h-WPYBQ.jpg)

Version 319 hydrus_dev 08/22/2018 (Wed) 22:01:01 Id: ef894a No. 9757
https://www.youtube.com/watch?v=8bw5h-WPYBQ

windows
zip: https://github.com/hydrusnetwork/hydrus/releases/download/v319/Hydrus.Network.319.-.Windows.-.Extract.only.zip
exe: https://github.com/hydrusnetwork/hydrus/releases/download/v319/Hydrus.Network.319.-.Windows.-.Installer.exe
os x
app: https://github.com/hydrusnetwork/hydrus/releases/download/v319/Hydrus.Network.319.-.OS.X.-.App.dmg
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v319/Hydrus.Network.319.-.OS.X.-.Extract.only.tar.gz
linux
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v319/Hydrus.Network.319.-.Linux.-.Executable.tar.gz
source
tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v319.tar.gz

I had a mixed week, but I got some good work done. It is mostly improvements to the downloaders, and then some stuff for advanced users to play with.

downloaders and urls

Following the success of the new gallery page 'delay' system, where gallery page hits across the program are spread out over time, I have made several improvements: the system's logic is more reliable, it works on a per-domain basis, and the waiting period and status updates now occur in the network engine itself (so they will appear in the network job control ui–I expect to add some kind of 'override wait' action to the cog menu soon). The thread watcher now also has a similar delay system, also under options->downloading.

Also, the network engine will now not permit more than three simultaneous jobs per domain as a global bottleneck, and the overall total number of connections is boosted as a result to 15 (from 10). In general, the network code should be just a bit more pleasant and less CPU-spiky while still serving decent throughput when there is work to be done.

The right-click->share->copy->urls menu on thumbnails and the media viewer is now much more powerful. For the focused file, it can now copy (all/all recognised) URLs, and for a thumbnail selection, it can copy (all/all of a specific URL Class) URLs. So, if you want to select a hundred files and get all their Deviant Art URLs in your clipboard real quick, it is now easy!

Also, the client now does some additional normalisation of URLs (it alphabetises query strings on all URLs, not just those matched in the new system). It will take a minute or two to convert and merge your existing DB URLs on update.

Due to some of the phases of network engine update and normalisation, some subscriptions have been hitting their 'periodic' limits by accident. If you have seen some of these, or get more this week, please do not freak out unless you get additional errors–it should all blow over in another sub cycle.

searchers, or gugs as I like to call them

I have finished the first version of the 'searcher' object. I have also decided to call it the Gallery URL Generator (GUG), as this more precisely describes what it does. Advanced users may like to check the new not-plugged-in dialog under network->downloader definitions->manage gallery url generators to see what I am going for. Let me know if there is anything you think this object needs that I have forgotten! Assuming this goes well, I'll plug this in next week and we'll start experimenting for real.

full list

- started the new convert-query-text-to-gallery-urls object. these objects, which I was thinking of calling 'Searchers', will be called the more specific and practical 'Gallery URL Generators', or GUGs for short
- the first version of GUGs is done, and I've written some test ui for advanced users under network->downloader definitions->manage gugs. this ui doesn't save anything yet, but lets you mess around with different values. if we don't think of anything else needed in the next week, I will fix this code for v320 and start filling in defaults
- watchers now have a global checking slot, much like the recent change to galleries and subs. it safely throttles dozens of threads so they don't rudely hammer your (or the destination server's) CPU if they all happen to want to go at once (like just after your computer wakes up). the option is similarly under options->downloading, and is global for the moment
- moved the new gallery delay/token management code to the better-fit bandwidth manager (it was in the domain manager before)
- the gallery delay/token code now works per-domain!
- moved the gallery delay/token checking code into the network job proper, simplifying a bunch of import-level code and making the text display appear in the network job control. token consumption now occurs after bandwidth (it is now the last hoop to jump through, which reduces the chance of a pileup in unusual situations). I expect to soon add some kind of 'force-go' action to the cog menu
- the network engine will now not permit more than three jobs active per domain, and the overall limit has been raised from ten to fifteen
- the media right-click menu now supports copying: all of a file's recognised urls; all of a file's urls; all selected files' urls of a specific url class; and all selected files' urls
- reworked and harmonised a bunch of url parsing and generation code–all urls should now appear as full unicode across the program, generally without %20-type encoding characters unless explicitly entered by the user. character encoding now all happens on the backend in requests
- non-url-class-matched urls now have their query parameters alphabetised as part of the normalisation process
- all urls in the db will have their query params alphabetised on update, and any file relationships merged to the new/existing normalised url
- the manage urls dialog will now normalise newly added urls (but should also still permit the removal of non-normalised urls)
- reworked how gallery hits update file import object caches, particularly for subscriptions
- fixed an issue in subscriptions gallery logging where the gallery log would always state it had found the max number of files and typically redundantly generate an 'ignored' stub–it should now say something like 'found 7 files - saw 5 previously seen urls, so assuming we caught up' as originally intended
- simplified some gallery->file import object creation
- galleries now compact until 100 entries (was 25)
- watchers now gallery-compact after a successful check
- watchers now show the 'just added'/'already watching' status for 15s, up from 5s
- network report mode now reports three times–once each for job addition, start, and successful completion
- fixed an issue with the new 'max width' popup sizing calculation that was sometimes not fitting new height requirements correctly
- fixed an issue with the new url class next page generation code
- fixed an issue where TIOs with data regarding since-deleted services were failing to initialise at the ui level
- misc status text cleanup

next week

I want to finish all the remaining 'default' gallery parsers. If I can get them all out and get GUGs actually working ok, I might be able to finish off the rest of the GUGs for v321, auto-convert all the legacy boorus people have to the new system, and completely delete the old downloader code! Fingers crossed, we are absolutely in the end-game of the downloader overhaul now.
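A simplified sketch of the per-domain throttling described in the notes above, purely as illustration rather than the client's actual network engine: one gate object that enforces the three-jobs-per-domain cap and spaces gallery hits out by a fixed delay. The ten-second figure and the class and function names are placeholders of mine, not hydrus settings.

import threading
import time
from collections import defaultdict

MAX_JOBS_PER_DOMAIN = 3          # per the v319 notes
GALLERY_DELAY_SECONDS = 10.0     # hypothetical default, not the real option value

class DomainGate:
    """Tracks running jobs and the next permitted gallery hit per domain."""

    def __init__(self):
        self._lock = threading.Lock()
        self._active_jobs = defaultdict(int)          # domain -> running job count
        self._next_gallery_time = defaultdict(float)  # domain -> earliest next gallery hit

    def try_start_gallery_job(self, domain):
        # Claim a slot if the domain is under its job cap and its gallery
        # delay has elapsed; otherwise the caller waits and retries later.
        with self._lock:
            now = time.monotonic()
            if self._active_jobs[domain] >= MAX_JOBS_PER_DOMAIN:
                return False
            if now < self._next_gallery_time[domain]:
                return False
            self._active_jobs[domain] += 1
            self._next_gallery_time[domain] = now + GALLERY_DELAY_SECONDS
            return True

    def finish_job(self, domain):
        with self._lock:
            self._active_jobs[domain] -= 1

gate = DomainGate()
if gate.try_start_gallery_job('example.booru.net'):
    try:
        pass  # fetch the gallery page here
    finally:
        gate.finish_job('example.booru.net')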
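The URL normalisation mentioned above (alphabetising query strings) can be illustrated with a few lines of standard-library Python; this is a minimal sketch rather than the client's actual code, and the booru domain is made up.

from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def normalise_url(url):
    # Sort the query parameters by name so equivalent URLs reduce to the
    # same string and their file relationships can be merged.
    parts = urlparse(url)
    params = sorted(parse_qsl(parts.query, keep_blank_values=True))
    return urlunparse(parts._replace(query=urlencode(params)))

print(normalise_url('https://example.booru.net/post?s=list&tags=samus_aran&page=2'))
# -> https://example.booru.net/post?page=2&s=list&tags=samus_aran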
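And a rough guess at the shape of a Gallery URL Generator, just to illustrate the convert-query-text-to-gallery-url idea; the class name, fields, and example URL are hypothetical, not the object behind the manage gugs dialog.

from urllib.parse import quote

class GalleryURLGenerator:
    def __init__(self, name, url_template, search_terms_separator='+'):
        self.name = name
        self.url_template = url_template                      # contains '%search_terms%'
        self.search_terms_separator = search_terms_separator

    def generate_gallery_url(self, query_text):
        # 'samus_aran blue_eyes' -> 'samus_aran+blue_eyes', each term percent-encoded
        terms = [quote(term, safe='') for term in query_text.split()]
        return self.url_template.replace('%search_terms%', self.search_terms_separator.join(terms))

gug = GalleryURLGenerator(
    'example booru tag search',
    'https://example.booru.net/index.php?page=post&s=list&tags=%search_terms%')
print(gug.generate_gallery_url('samus_aran blue_eyes'))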
lol, at least it's going fast
Ok, I'm curious, stuff like this bugs me every time I restart. I know I asked before if saving the orientation would be possible and you said it would take a lot of work, or something along those lines. However, instead of a user-defined state, would it be possible to have the program look at what is there and try to show all the info it can without '…'? I imagine it could be program-handled until the user moves something inside it, like making the subject column a bit bigger because that was getting lost; then, until a resize of the area, the program wouldn't update how big a column is. As you can see by the screenshots, I keep my side area wide to the point that the program should never really be truncating anything. Using the screenshot program, it would take around 700 pixels wide to not truncate anything, about 500 wide if you truncate a bit of the subject, and about 400 if you truncate the added column. As I was writing this, I just found out that there is a horizontal scroll in that window; I never needed to move anything over, as everything fit by the time these watchers were added. Knowing this, would it be possible to have a truncated mode and an untruncated mode option?

On this topic, over the last few days I was doing things in the watchers and pages of pages, and they keep moving from where I put them to the far right of the main window. Before, I thought this was due to the program locking up, but that doesn't seem to be the case, as the program is perfectly usable every time it happens. Would it be possible to have a tab position lock of sorts so they don't move around? I'm honestly thinking that when I click between tabs there may be a small drag, something I don't notice but the program does, and it moves a tab to a new position.

I'll probably have a bit more to say on the way downloaders are handled later on, as I just finished grabbing things before I updated; it will take some time before there are new things to grab.
(13.32 KB 1081x549 client_2018-08-22_20-09-28.png)

>>9761 forgot the third image
(2.72 KB 240x160 Capture.png)

Hello, is it possible to choose the browser that Hydrus opens links in, rather than just using my default browser in Windows?
>>9775 Yes, in options > files and trash.
Once you start a dupe search at a selected distance, it's not possible to stop it, despite there being a stop button, because the entire program locks up and stops responding. :(
Got this error, then the whole client froze:

wxAssertionError
C++ assertion "IsRunning()" failed at ..\..\src\common\evtloopcmn.cpp(83) in wxEventLoopBase::Exit(): Use ScheduleExit() on not running loop
File "include\ClientGUITopLevelWindows.py", line 479, in EventOK
self.DoOK()
File "include\ClientGUITopLevelWindows.py", line 667, in DoOK
self.EndModal( wx.ID_OK )
>>9762 >>9761
I am sorry for the awkwardness of the new listctrl. I'd like to make these sorts of improvements, but it will have to wait for the rewrite from the old listctrl (which crashed on sort, so we are getting better) to be finished first.

>>9775 >>9779
Thanks, yeah–those media viewer hover window links don't use that option yet, but I will be fixing them in the coming weeks. The right-click->known urls->open menu does work with the option for now.

>>9781
Thank you for this report. I will see if I can insert a better stop check in that.

>>9782
And thank you for this as well. I have seen this a couple of times before but have not been able to pin it down. Can you remember what you were doing at the time? Had you recently closed a dialog, or were attempting to? Do you have a 'busy' client with say dozens of pages and downloaders all working at once?
(2.61 KB 281x320 2018-08-25_21-38-53.png)

(2.02 KB 172x249 client_2018-08-25_21-39-03.png)

Ok, got an issue. Not sure if this would be solved by a restart or not; I'm doing too much shit at the moment to really do that. One is normal, but when I mouse over it, it compacts.

As for how the program handles imports to the watchers… it's better now, it's more useful, but it still chugs. However, 2 threads 404ed on 4chan before the program was able to pick them up. Generally this only happens on /b/ because of thread lifespan. Would it be possible to put an option in the watchers to handle as many threads as possible as fast as possible, as an override/option? For pretty much any other board I'm good with it adding slowly, but /b/ needs it to go fast, because the first page 10 and the last one are generally 2 minutes or less from going away.
What are the chances of ever getting an RSS-feed-like thing for certain imageboards that shows all threads containing certain user-set tags/strings which you can then quick-open a watcher for?
Hi, I'm having some issues with webms on this version.

When trying to import:
>Unparsable file

When closing the import dialog after trying to import a webm, the client crashes with:
>double free or corruption (out)
>Aborted (core dumped)

When trying to view a webm in the client:
>Exception
>Unable to render that video! Please send it to hydrus dev so he can look at it!
>Traceback (most recent call last):
> File "include/ClientRendering.py", line 370, in THREADRender
> File "include/HydrusVideoHandling.py", line 767, in read_frame
>Exception: Unable to render that video! Please send it to hydrus dev so he can look at it!

When trying to open externally:
>mpv: relocation error: /usr/lib/libssh.so.4: symbol gcry_mpi_ec_decode_point version GCRYPT_1.6 not defined in file libgcrypt.so.20 with link time reference

I posted before about issues with liblzma, so maybe this is another similar library issue.
(2.13 KB 370x38 Untitled.png)

Hi, I'm having a strange issue where some tags don't appear in the pend option. The "pending" option doesn't appear at all until I pend/petition a high number of tags. See this picture for an example: I have 111 tags to upload and 19 to petition, yet the pending shows only (34). This is a bit annoying because usually, when I make a big import from a booru, before I start archiving I do a cleanup process of removing useless tags like "invalid tag", or misspelled tags, or retarded meme tags like "thicc" instead of "thick", and so on. Sometimes I miss one or two, and I remove them in a small upload after. But now I cannot, because unless I request a high number of tags across multiple pictures, the pending option will not appear. It's like it doesn't recognize certain pend/petition requests, and it won't show up until it finds one that it does recognize. I was told to reset the processing cache of the PTR, but doing this only serves to freeze my hydrus client permanently until I kill the task from task manager.
>>9789 Thanks. The layout of the ratings bubbles here is actually a tricky one–the ones on the hover window are actual legit OS GDI objects that get laid out like any other button or text widget, but the ones on the media canvas are just shapes I draw on a bitmap using my own +2px pseudo-layout shit (which is also why they are un-clickable, if you ever manage to get your mouse on them before the hover window pops into place). Depending on some OS layout and dpi/zoom preferences and all that, these layout calculations can differ a bit, and I guess that is extreme in your case due to the number of rows you have. Which OS version are you on, and do you run 100% dpi zoom or something else?
>>9791 Perhaps in a future iteration of the downloader engine, but not this pass. I've completed most of this rewrite's objectives, which were to convert the legacy system to something user-editable and saner behind the scenes, and I don't have time to write any new custom ui.

If you don't have a workflow already, I recommend you set up a system where you check the catalogs of the boards you like every x time units, do some manual text searches or pick the top y threads you like in the first three pages, and just drag and drop on the client. I am not familiar with the various imageboard add-ons for browsers–and I think there are other thread-watcher apps, although I don't remember the names–so maybe one of those can provide you with a subject/url summary list of the boards you like? Again, if you can just isolate the text and copy/select it, it can be just one more click to get the thread into hydrus.

Or, looking at it, it looks like you can just plug this rss into a reader and you might be off: https://8ch.net/hydrus/index.rss

Or if your browser can render json nicely: https://8ch.net/hydrus/0.json
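For anyone who wants to script the catalog-checking workflow described above, here is a small sketch that pulls a board's page-0 JSON and prints threads whose subject or comment mentions a keyword, so a matching thread can be pasted into a watcher. It assumes a 4chan-style JSON layout ('threads'/'posts'/'sub'/'com'/'no'), which is a guess about the 8ch endpoint rather than something verified, and the keywords are examples.

import json
from urllib.request import urlopen

BOARD_JSON = 'https://8ch.net/hydrus/0.json'
KEYWORDS = ('release', 'bug')   # whatever you want to watch for

with urlopen(BOARD_JSON) as response:
    page = json.load(response)

for thread in page.get('threads', []):
    op = thread['posts'][0]   # the OP carries the subject
    text = (op.get('sub', '') + ' ' + op.get('com', '')).lower()
    if any(keyword in text for keyword in KEYWORDS):
        print(op['no'], op.get('sub', '(no subject)'))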
>>9795 Something about PyInstaller–the program I use to freeze an executable python environment–is funky on some flavours of linux, and the cwd or shell environment or something just gets messed up and the program can't run/find other exes well. I am still trying to pin it down, but I am not a linux expert by any means. Since you are also getting crashes, I recommend you try running from source. There is more info here: >>9768 https://hydrusnetwork.github.io/hydrus/help/running_from_source.html
>>9799 Yeah, no hurry, I'm already using 4chan X to track individual threads and also pin/hide threads based on tag lists and blacklists, I was just wondering if it was in the cards down the road. If you do try 4chan X, do some searching around and testing, there are a variety of forks, some are dead or broken, and I haven't upgraded in a while for fear of breaking everything. I knew that 8ch had json, but I didn't know it had RSS feeds, so thanks for that. Now if people will just post on 8ch again…
>>9796 Thank you for this report. Every time I think I pin down one of these miscounts, another appears. Please try help->debug->data actions->clear db service info cache. This will force a clear-and-regen of those counts and should fix you up. This is probably a one-in-a-thousand random event, but if it keeps coming back for you, please let me know and we'll go deeper and try to figure out where the miscount is coming from. Thank you for petitioning these bad tags. I hope to roll out a tag filter for the PTR in the nearish future that will ban 'invalid tag' and 'artist request' and the other booru artifacts so we won't have to deal with them again.
>DA provides a 'next page' link using a meta 'link' tag with rel="next", but it is 404 invalid due to a typo on their end
https://www.deviantart.com/shuubaru/gallery/?offset=24 -> https://www.deviantart.com/shuubaru/shuubaru/gallery/?offset=48
(710.24 KB 1200x924 1466741901564-5.jpg)

>>9802
oh my god, that's all I had to do? thanks hydev, you solved it.

>>9803
>a fucking typo breaks the site
this is Aliens tier
>>9785
>Can you remember what you were doing at the time? Had you recently closed a dialog, or were attempting to? Do you have a 'busy' client with say dozens of pages and downloaders all working at once?
I might be mistaken but I think I was closing a manage tags window. It's not a busy client. Less than 10 pages with few items in them, nothing downloading.
>>9809 I happened to get this error myself this week and think I figured it out. Please let me know if you have any more trouble in v320 on!
I found a cool bug in v318, but not sure what it is or if you already know about it. Here goes:
>I had several multiwatcher tabs open, most idle but one downloading 12 or so threads at once
>I had one simple downloader tab open, downloading from around fifteen threads from an off-site archive
>also syncing tumblr subscriptions at same time
>switch to the dupes tab and run the rebalancing, hadn't done it since before the new tumblr fuckery so I had I think 84 branches to rebalance
>goes for a while, I'm tabbed out reading a thread and messing with jdown and stuff
>tab back in, it's just finished, hit OK, instead of letting it sit a minute and then running dupe discovery, I immediately switch back to the working multiwatcher, but hydrus hangs as it tends to because my laptop sucks, especially after finishing big operations, a message box has popped up under the subscriptions box, but it's blank because the client froze there
>after ten or twenty seconds it resolves itself and refreshes the screen, seems normal, except the subscriptions downloader box and the message box below it are just gone
>figure it was just a message saying it thought it was done syncing for the day, not sure why they closed themselves though
>elapse ten or twenty seconds of normal behavior
>open up tags to enter in new tag for thread I'm about to paste into multiwatcher, enter them and remove last thread's unique tags, hit OK or Apply or whatever, box closes
>client hangs again, this time doesn't recover, but rather than going (Not Responding) it just makes that annoying system bell whenever I try to click on any tab or anything at all
>downloader pops back up and starts, figure it's fine and I can just wait a bit since it's actually downloading
>stays frozen, can't select anything, but downloader still going and there's a rare C++ level error spat out at me instead of Python
>running it in debug, so go to console window and look at it, shows basically same thing (forgot to screenshot the full error, sorry)
>was able to select the program itself before from other front windows, but after selecting the debug console now can't select the GUI and the console stays up in front, still dinging
>Ctrl+C from the console so it doesn't crash my computer when I try to close the GUI
>restart, everything is completely normal and fine except the new tags weren't saved as entered in

The error that popped up under the subscriptions box when it first froze up was:
C++ assertion "IsRunning()" failed at ..\..\src\common\eventloopcmn.cpp(83) in wxEventLoopBase::Exit():
Use ScheduleExit() on not running loop
Doesn't really hurt anything and may not be duplicable on a non-potato, but I thought it was interesting.
>>9819 Thank you for this report. This looks the same as >>9782 . The 'modal' popup that comes up during idle maintenance and a couple of other places was being rude to any pre-existing dialogs. The dialog gets hidden but then can't restore when the maintenance job is over, and since the dialog is still modal, ui functionality can't return to the main ui frame. In your case, it was probably the tags window for your new multi-watcher. Maybe some idle job got delayed while the dupe search was going on, arrived late once your db-commit hang finished, and then blatted itself into the ui event loop right when you had that dialog open. I believe I have this fixed for today, but if you experience it again, particularly with the dupe search stuff–which works on slightly different logic–please let me know. I now know what the error is actually about!

