https://www.youtube.com/watch?v=gTFG_nvreoI
windows
zip:
https://github.com/hydrusnetwork/hydrus/releases/download/v378/Hydrus.Network.378.-.Windows.-.Extract.only.zip
exe:
https://github.com/hydrusnetwork/hydrus/releases/download/v378/Hydrus.Network.378.-.Windows.-.Installer.exe
macOS
app:
https://github.com/hydrusnetwork/hydrus/releases/download/v378/Hydrus.Network.378.-.macOS.-.App.dmg
linux
tar.gz:
https://github.com/hydrusnetwork/hydrus/releases/download/v378/Hydrus.Network.378.-.Linux.-.Executable.tar.gz
source
tar.gz:
https://github.com/hydrusnetwork/hydrus/archive/v378.tar.gz
I had a great, simple week. Searches are less likely to be very slow, and system:limit searches now sort.
all misc this week
I identified a database access routine that was sometimes not taking an optimal route. Normally it was fine, but with certain sizes or types of query, it could take a very long time to complete. This mostly affected multi-predicate searches that included certain tags or system:duration and system:known urls, but the routine was used in about 60 different places across the program, including tag and duplicate files processing. I have rewritten this access routine to work in a more 'flat' way that will ensure it is not so 'spiky'.
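To give a flavour of what 'flat' means here, this is an illustrative SQLite sketch only, not Hydrus's actual routine or schema: instead of trusting the query planner with a nested subquery (whose best plan changes with the size of the result, hence the spikes), the candidate ids are materialised into a temporary table and then joined in a simple, predictable way.

# illustrative only -- invented tables, not Hydrus's real code
import sqlite3

con = sqlite3.connect( ':memory:' )

con.executescript( '''
    CREATE TABLE files ( hash_id INTEGER PRIMARY KEY, duration INTEGER );
    CREATE TABLE mappings ( tag_id INTEGER, hash_id INTEGER, PRIMARY KEY ( tag_id, hash_id ) );
''' )

def spiky_query( tag_id ):
    
    # the planner chooses how to satisfy the IN subquery; for some tag sizes
    # that choice is good, for others it is terrible, hence 'spiky' job times
    return con.execute(
        'SELECT hash_id FROM files WHERE duration > 0 AND hash_id IN ( SELECT hash_id FROM mappings WHERE tag_id = ? );',
        ( tag_id, )
    ).fetchall()

def flat_query( tag_id ):
    
    # same work in two predictable steps: fetch the ids, then join against them
    con.execute( 'CREATE TEMP TABLE IF NOT EXISTS temp_hash_ids ( hash_id INTEGER PRIMARY KEY );' )
    con.execute( 'DELETE FROM temp_hash_ids;' )
    con.execute( 'INSERT OR IGNORE INTO temp_hash_ids SELECT hash_id FROM mappings WHERE tag_id = ?;', ( tag_id, ) )
    
    return con.execute(
        'SELECT files.hash_id FROM temp_hash_ids CROSS JOIN files ON ( files.hash_id = temp_hash_ids.hash_id ) WHERE duration > 0;'
    ).fetchall()

The flat version pays a small fixed cost every time, but its run time no longer swings wildly with the shape of the query, which is the trade-off described above.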
Also in searching, I managed to push all the 'simple' file sorts down to file searches that have 'system:limit'. If you search with system:limit=256 and are sorting by 'largest files first', you will now see the 256 largest files in the search! Previously, it would give a random sample. All the simple sorts are supported: import time, filesize, duration, width, height, resolution ratio, media views, media viewtime, num pixels, approx bitrate, and modified time. If you want something fun, do a search for just 'system:limit=64' (and maybe system:filetype) and try some different sorts with F5. You can now see the oldest, smallest, longest, widest, whateverest files in your collection much more easily.
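To make the ordering change concrete, here is a tiny sketch with an invented schema (not Hydrus's real tables): the difference is simply whether the sort is applied before or after the limit.

# illustrative only -- shows why sorting before the limit returns the
# largest files rather than a random-ish sample
import sqlite3

con = sqlite3.connect( ':memory:' )
con.execute( 'CREATE TABLE files ( hash_id INTEGER PRIMARY KEY, size INTEGER );' )
con.executemany( 'INSERT INTO files ( size ) VALUES ( ? );', [ ( n, ) for n in ( 50, 900, 10, 700, 300 ) ] )

# old behaviour: grab any 3 matching rows, then sort whatever you happened to get
old = con.execute( 'SELECT size FROM ( SELECT size FROM files LIMIT 3 ) ORDER BY size DESC;' ).fetchall()

# new behaviour: sort the whole result first, then take the top 3 -- the 3 largest files
new = con.execute( 'SELECT size FROM files ORDER BY size DESC LIMIT 3;' ).fetchall()

print( old ) # three sizes, sorted, but not necessarily the largest
print( new ) # [(900,), (700,), (300,)]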
There are also some fixes: if you had sessions not appearing in the 'pages' menu, they should be back; if you have had trouble with ipfs directory downloads, I think I have the file-selection UI working again; 'remove files when trashed' should work more reliably in downloader pages; and several tag and selection lists should size themselves a bit better.
full list
- if a search has system:limit, the current sort is now sent down to the database. if the sort is simple, results are now sorted before system:limit is applied, meaning you will now get the largest/longest/whateverest sample of the search! supported sorts are: import time, filesize, duration, width, height, resolution ratio, media views, media viewtime, num pixels, approx bitrate, and modified time. this does not apply to searches in the 'all known files' file domain.
- after identifying a sometimes-unoptimal db access routine, wrote a new more reliable one and replaced the 60-odd places it is used in both client and server. a variety of functions will now have less 'spiky' job time, including certain combinations of regular tag and system search predicates. some jobs will have slightly higher average job time, some will be much faster in all common situations
- added additional database analysis to some complicated duplicate file system jobs that adds some overhead but should reduce extreme spikes in job time for very large databases
- converted some legacy db code to new access methods
- fixed a bug in the new menu generation code that was not showing sessions in the 'pages' menu if there were no backups for these sessions (i.e. they have only been saved once, or are old enough to have been last saved before the backup system was added)
- fixed the 'click window close button should back out, not choose the red no button' bug in the yes/no confirmation dialogs for analyze, vacuum, clear orphan, and gallery log button url import
- fixed some checkbox select and data retrieval logic in the checkbox tree control and completely cleared out the buggy ipfs directory download workflow. I apologise for the delay
- fixed some inelegant multihash->urls resolution in the ipfs service code that would often mean a large folder would lock the client while parsing was proceeding
- when the multihash->urls resolution is going on, the popup now exposes the underlying network control. cancelling the whole job mid-parse/download is now also quicker and prettier
- when a 'downloader multiple urls' popup is working, it will publish its ongoing presented files to a files button as it works, rather than just once the job is finished
- improved some unusual taglist height calculations that were turning up
- improved how taglists set their minimum height: the 'selection tags' list should now always have at least 15 rows, even when bunched up in a tall gallery panel
- if the system clock is rewound, new objects that are saved in the backup system (atm, gui sessions) will now detect that existing backups are from the future and increase their save time to ensure they count as the newest object
- short version: 'remove files from view when trashed' now works on downloader thumbs that are loaded in from a session. long version: downloader thumb pages now force 'my files' file domain for now (previously it was 'all local files')
- the downloader/thread watcher right-click menus for 'show all downloaders xxx files' now have a new 'all files and trash' entry. this will show absolutely everything still in your db, for quick access to accidental deletes
- the 'select a downloader' list dialog _should_ size itself better, with no double scrollbars, when there are many many downloaders and/or very long-named downloaders. if this layout works, I'll replicate it in other areas
- if an unrenderable key enters a shortcut, the shortcut will now display an 'unknown key: blah' statement instead of throwing an error. this affected both the manage shortcuts dialog and the media viewer(!)
- SIGTERM is now caught on non-windows systems and will initiate a fast forced shutdown (a generic sketch of the technique is below, after this list)
- unified and played with some border styles around the program
- added a user-written guide to updating to the 'getting started - installing' help page
- misc small code cleanup
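For anyone curious about the SIGTERM item above, catching it in a Python program generally looks like the following. This is a generic sketch using the standard library's signal module, with a hypothetical handler name, not Hydrus's actual shutdown code:

# generic sketch: catch SIGTERM (what service managers and a plain 'kill' send)
# and turn it into a fast, orderly shutdown. handler name is hypothetical.
import signal
import sys

def _initiate_fast_forced_shutdown( signum, frame ):
    
    # a real application would tell its main loop to save what it must and exit quickly
    print( 'caught signal {}, shutting down fast'.format( signum ) )
    sys.exit( 0 )

if sys.platform != 'win32':
    
    # SIGTERM does not arrive the same way on Windows, hence non-windows only
    signal.signal( signal.SIGTERM, _initiate_fast_forced_shutdown )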
next week
I am going to take a few days off for the holiday and make the next release in two weeks, for New Year's Day. I expect to do some small jobs, push more on the database optimisation, continue improving the UI layout code, and perhaps put some time into some space-clearing database maintenance.
Merry Christmas!