https://www.youtube.com/watch?v=PXAlznKcJvA
windows
zip:
https://github.com/hydrusnetwork/hydrus/releases/download/v362/Hydrus.Network.362.-.Windows.-.Extract.only.zip
exe:
https://github.com/hydrusnetwork/hydrus/releases/download/v362/Hydrus.Network.362.-.Windows.-.Installer.exe
os x
app:
https://github.com/hydrusnetwork/hydrus/releases/download/v362/Hydrus.Network.362.-.OS.X.-.App.dmg
linux
tar.gz:
https://github.com/hydrusnetwork/hydrus/releases/download/v362/Hydrus.Network.362.-.Linux.-.Executable.tar.gz
source
tar.gz:
https://github.com/hydrusnetwork/hydrus/archive/v362.tar.gz
I had a mixed week. The duplicates overhaul work is finished.
duplicates work finished
The duplicates storage overhaul is done! Everything is now on the new storage system, the duplicate filter has had some quality of life attention, and now there is some updated help:
https://hydrusnetwork.github.io/hydrus/help/duplicates.html
If you have used hydrus for a bit but haven't checked out the duplicate system yet, this is a good time.
I added some final new things this week: the duplicates filter now highlights common resolutions, like 720p and 1080p, and there is a new 'comparison statement' for common ratios like 4:3 or 16:9. Also, the thumbnail right-click file relationships menu now provides a choice to clear out potential relationships and, if in advanced mode, perform some en masse remove/reset actions on multiple thumbnails at once.
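If you are curious, the general idea behind the ratio statement is just to test width/height against a short list of common ratios with a small tolerance. Here is a rough python sketch of that idea, illustrative only and not the actual client code:

# rough sketch of a 'nice ratio' test, in the spirit of the new comparison
# statement. this is illustrative only, it is not the actual hydrus code.
NICE_RATIOS = [ ( 1, 1, '1:1' ), ( 4, 3, '4:3' ), ( 5, 4, '5:4' ), ( 16, 9, '16:9' ), ( 21, 9, '21:9' ), ( 2.35, 1, '2.35:1' ) ]

def nice_ratio_statement( width, height ):
    ratio = width / height
    for ( w, h, label ) in NICE_RATIOS:
        # a small tolerance catches resolutions that are only approximately the named ratio
        if abs( ratio - ( w / h ) ) < 0.01:
            return label
    return None

print( nice_ratio_statement( 1920, 1080 ) ) # 16:9
print( nice_ratio_statement( 1280, 1024 ) ) # 5:4
print( nice_ratio_statement( 1234, 881 ) ) # None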
I am now free to move on to another large job. Audio support was very popular at the last vote, so I will spend a couple of weeks trying to get some simple 'has audio' metadata going, but then I am going to address some growing issues related to tag repositories, easier 'I want these tags' management, namespace siblings, and multiple local tag services.
deepdanbooru plugin
If you are an experienced user and interested in testing out some neural net tagging, check this out:
https://gitgud.io/koto/hydrus-dd/
This project by a hydrus user lets you generate tags using the DeepDanbooru model and get them into hydrus in a variety of ways. If you give it a go, let me know how it goes and what I can do to make it work better on the hydrus end.
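As a rough illustration of one way an external script like this can hand tags to the client, here is a python sketch that talks to the Client API's /add_tags/add_tags endpoint. It assumes you have the Client API turned on at the default address with an access key that has the 'add tags' permission, and the parameter names here follow the Client API help of this era, so they may differ between versions:

# rough sketch: pushing tags for a file to the hydrus Client API.
# assumes the Client API is enabled on the default port with an access key
# that has the 'add tags' permission.
import json
import urllib.request

API = 'http://127.0.0.1:45869'
ACCESS_KEY = 'your 64-character hex access key here'

def add_tags( sha256_hash, tags, service_name = 'my tags' ):
    body = json.dumps( {
        'hash' : sha256_hash,
        'service_names_to_tags' : { service_name : tags }
    } ).encode( 'utf-8' )
    request = urllib.request.Request(
        API + '/add_tags/add_tags',
        data = body,
        headers = {
            'Hydrus-Client-API-Access-Key' : ACCESS_KEY,
            'Content-Type' : 'application/json'
        } )
    with urllib.request.urlopen( request ) as response:
        return response.status # 200 on success

# e.g. with tags produced by a DeepDanbooru pass on that file
# (the hash and tags here are placeholders)
add_tags( 'replace with the file\'s sha256 hash in hex', [ '1girl', 'blue sky', 'smile' ] )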
full list
- duplicates work finished:
- updated the duplicates help text and screenshots to reflect the new system
- duplicate files search tree rebalancing is now done automatically on the normal idle maintenance routine, and its over-technical UI is removed from the duplicates page
- the duplicate filter's resolution comparison statement now specifies 480p, 720p, 1080p, and 4k resolutions and highlights resolutions with odd (i.e. non-even) numbers
- if the files are of different resolution, a new 'ratio' comparison statement will now show if either has a nice ratio, with current list 1:1, 4:3, 5:4, 16:9, 21:9, 2.35:1
- added a 'stop filtering' button to the duplicate hover frame
- made the ill-fitting 'X' button on the top hover frame a stop button and cleaned up some misc related ui layout
- added a 'remove this file's potential pairs' command to the thumbnail file relationships menu
- if in advanced mode, multiple thumbnail selection right-click menus' file relationships submenus will now offer mass remove/reset commands for the whole selection. available commands are: 'reset search', 'remove potentials', 'dissolve dupe groups', 'dissolve alt groups', 'remove false positives'
- .
- the rest:
- added link to https://gitgud.io/koto/hydrus-dd/, a neat neural net tagging library that uses the DeepDanbooru model and has several ways of talking to hydrus, to the client api help
- cleaned up a little of the ipfs file download code, mostly improving error/cancel states
- rewrote some ancient file repository file download code, which ipfs was also using when commanded to download via a remote thumbnail middle-click. this code and its related popup is now cleaner, cancellable, and session-based rather than saving download records to the db (which caused a couple of edge-case annoyances for certain clients). I think it will need a bit more work, but it is much saner than it was previously
- if you do not have the manage tags dialog set to add parents when you add tags, the autocomplete input will no longer expand parents in its results list
- fixed an issue displaying the 'select a downloader' list when two GUGs have the same name
- hitting apply on the manage parsers or url classes dialogs will now automatically do a 'try to link' action as under manage url class links
- fixed (I think!) how the server services start, which was broken for some users in 361. furthermore, errors during initial service creation will now cancel the boot with a nice message, and the 'running … ctrl+c' message will appear strictly after the services have started ok the first time, and services will shut down completely before the db is asked to stop
- improved how the program recognises shutdowns right after boot errors, which should speed up clean shutdowns after certain bad server starts
- the server will use an existing server.crt and server.key pair if they exist on db creation, and complain nicely if only one is present
- the 'ensure file out of the similar files system' file maintenance job will now automatically remove the file from (or dissolve) its duplicate group, if any, and clear out outstanding potential pairs
- a system language path translation error that was occurring on some unusual filesystems when checking for free disk space before big jobs is now handled better
- like repository processing, there is now a 1 hour hard limit on any individual import folder run
- fixed an issue where if a gallery url fetch produced faulty urls, it could sometimes invalidate the whole page with an error rather than just the bad file url items
- subscriptions will now stop a gallery-page-results-urls-add action early if that one page produces 100 previously seen urls in a row (a rough sketch of this check is below the list). this _should_ fix the issue users were seeing with pixiv artist subs resyncing with much older urls that had previously been compacted out of the sub's cache
- until we can get better async ui feedback for admin-level repository commands (like fetching/setting account types), they now override bandwidth rules and only try the connection once for quicker responses
- misc code cleanup
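As mentioned in the subscription item above, the stop test is basically a counter over a gallery page's urls that resets whenever something new appears. Here is a rough python sketch of that heuristic, illustrative only and not the actual subscription code:

# rough sketch of the 'stop early' heuristic: if one gallery page yields
# `limit` previously seen urls in a row, assume we have hit old territory
# and stop the urls-add action for this sync.
def should_stop_adding( page_urls, already_seen_urls, limit = 100 ):
    consecutive_seen = 0
    for url in page_urls:
        if url in already_seen_urls:
            consecutive_seen += 1
            if consecutive_seen >= limit:
                return True
        else:
            consecutive_seen = 0
    return False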
next week
I was unable to get to the jobs I wanted to this week, so I think I'll go for a repeat: updating the system:hash and system:similar_to predicates to take multiple files and extending the Client API to do cookie import for easier login. And I'll play around with some audio stuff.