/hydrus/ - Hydrus Network

Archive for bug reports, feature requests, and other discussion for the hydrus network.


(18.77 KB 480x360 jm2vqlZJ4b8.jpg)

Version 342 hydrus_dev 03/07/2019 (Thu) 00:05:14 Id: 47bce6 No. 11805
https://www.youtube.com/watch?v=jm2vqlZJ4b8

windows
zip: https://github.com/hydrusnetwork/hydrus/releases/download/v342/Hydrus.Network.342.-.Windows.-.Extract.only.zip
exe: https://github.com/hydrusnetwork/hydrus/releases/download/v342/Hydrus.Network.342.-.Windows.-.Installer.exe
os x
app: https://github.com/hydrusnetwork/hydrus/releases/download/v342/Hydrus.Network.342.-.OS.X.-.App.dmg
linux
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v342/Hydrus.Network.342.-.Linux.-.Executable.tar.gz
source
tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v342.tar.gz

I had a good week. There are several new ui features and bug fixes, and webp and tiff now have basic support.

tiff and webp

Tiff files are now supported! Tiff is an old format typically used for photographic and scientific purposes (think 12,000x12,000px images of the moon). They'll import to hydrus like any other image and will work with the duplicate file system. I tested 24bit and 8bit (monochrome), but could not find a 48bit example - even the easily accessible NASA stuff was all 24bit - so I don't know if high colour-depth tiffs work. If you have an example of one, please send it to me, and if it doesn't work in hydrus, I'll see if I can figure it out.

EDIT: Some tiffs are not supported. I know roughly what the problem is and will fix it for next week.

And webp files are now supported! Webp is a newer initiative from Google to replace jpgs and pngs. It offers some impressive file size reductions, but hasn't seen much adoption yet. Webps will now import to hydrus like any other image and will work with the duplicate file system. WebP also supports animation, much like gif, but I haven't written support for this yet. For now, hydrus will assume all webps are static images, and animations will only give you the first frame (just like how apngs used to import when hydrus considered them pngs).
I did not have to do much work to get tiff and webp going - I just hooked in my existing image libraries, PIL and OpenCV. If you run from source and have slightly older versions of these libraries, your support may not be complete. I hope to keep working here and try to get animated webps going. If you are enthusiastic about webp, let me know how you get on - they mite b cool for some purposes, and if the library support for saving them is there, I might start using them in hydrus's thumbnail system, including for the long-planned animated thumbnail system.

otherwise all misc this week

Tag autocomplete search should be a bit quicker to update when you temporarily pause on a large search (e.g. if you pause on 'title:' and then type 'gar', the big laggy search for 'title:*' will now be cancelled much faster so the quicker 'title:gar*' can replace it).

Every one of the complicated 'copy' URL thumbnail menu entries is now replicated on the 'open' menu! So, if you select five files that all have twitter URLs, you can hit right-click->known urls->open->these files' twitter tweets to open them all in one go! It will open them one at a time, one per second, so as not to nuke your browser. If asked to open more than a few URLs like this, the client will warn you with a yes/no dialog and throw up a popup that will let you cancel the operation (just in case you accidentally try to open 500 URLs!).

By default, 'system:everything' is now hidden in search contexts with more than 10,000 files! This search predicate is sometimes useful to new users, but experienced users hit it more by accident than not. If you would rather have it anyway, you can force it to appear with the new checkbox under options->default system predicates.

Gallery download pages now have a more informative 'status' column - they'll report gallery fetch status and say 'done!' when all work is complete.
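The 'one url per second' opening behaviour described above can be sketched roughly like this. `open_urls_throttled` is a hypothetical helper name, and the real client's logic is more involved (confirmation dialog, cancellable popup); the opener is injectable so the throttling logic can be exercised without launching a browser.

```python
import time
import webbrowser

def open_urls_throttled(urls, opener=webbrowser.open, delay=1.0,
                        cancelled=lambda: False):
    # Open urls one at a time, pausing between each so the browser is
    # not flooded; 'cancelled' is polled before each open so a popup
    # can abort the run part-way through. Sketch only, not hydrus code.
    opened = []
    for i, url in enumerate(urls):
        if cancelled():
            break
        if i > 0:
            time.sleep(delay)
        opener(url)
        opened.append(url)
    return opened
```

With the default opener this launches the system browser; passing a stub opener (e.g. `list.append`) makes it easy to test.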
After many attempts and some great help from users, I believe I have finally fixed the Linux 'similar files search with >0 distance' crash. One user and I managed to isolate the program area and figure out a cleaner layout, and it seems to be fixed. If you have had this problem, please give it a hesitant go and let me know what happens.

Also, I cleaned up some video rendering logic and memory management. Videos should render a little smoother and more responsively, and they will clean up after themselves better as well. This may have fixed a long-running but rare all-platforms crash related to launching or closing the media viewer on a playing video.

full list

- added support for webp import. it does not yet support animated webps, which, if the local platform supports them, will import like apngs used to: just the first frame
- added support for tiff import. it works ok for 24bit and 8bit (monochrome) tiffs, but I am not sure how well it will do with 48bit
- both webp and tiff should work with the duplicate files system
- improved webm detection to include opus audio (previously, these files were falling back to mkv)
- fixed an issue where unusual formats with duration but no frames, or frames but no duration, were being sorted and otherwise presented incorrectly
- improved autocomplete job cancelability. this job can now cancel much faster on large jobs, meaning typing searches with large result sets will hit less CPU and return faster on subsequent keystrokes
- _all_ of the complicated 'copy url' commands from the thumbnail right-click->known urls menu are now available on the 'open' submenu! if there is more than one url to open (e.g. 'open all of these files blahbooru post urls' on a selection of 50 files), you will be presented with a yes/no dialog to confirm, and it will open one url in your browser every second (with a cancellable popup if num_urls > 5)
- by default, system:everything is now hidden if its total file count is >10k. you can force it to always show under options->default system predicates
- the gallery downloader's list's status column now shows gallery status (deferring to active file status) when appropriate and shows 'done!' when all work is complete
- after working back and forth with a user, I _believe_ the linux similar files >0 distance search crash is finally fixed
- fixed sorting by media views/viewtiming with collections
- a single-selected collection right-click now shows total media views for all files in the collection! you can now see how long you have been viewing an artist!
- fixed an issue that led to export folders not running on always-on clients as often as they should
- updated the gelbooru 0.2.5 file page parser to pull the rating tag from the correct location (previously, it was pulling from what appears to be a site-wide 'mature' browser hint)
- improved memory cleanup stability when animations and other parts of the video rendering pipeline are deleted - this _may_ fix some rare crashes
- increased animation rendering aggression overall, and particularly in the 'future' of the frame buffer
- if a video renderer that is asked to start some way into the video fails to render anything, it will now fall back to trying to render from the beginning. this is slightly hacky atm and leads to out of phase rendering frames, but it is better than an error
- added a '--no_db_temp_files' launch parameter that will force the client or server to return to the recent old behaviour of exclusively using memory for journalling. this is useful if your temp directory is small and/or your available ram is very large. if running in this mode, the client will attempt to check available memory (instead of free space on your temp dir) before performing very large transactions
- with the new lighter-weight update transactions, the client now tests for less free space for journalling before running repository update processing
- added /get_files/search_files to the client api, which does the first half of file searching. it allows tag search (including -tag negation) and system inbox/archive. since the second half, which will fetch file metadata, is not yet in, this can't do anything interesting yet
- updated help and unit tests to support this; client api version is now 3
- some misc refactoring

next week

I got the first half of Client API file search done this week, but it is useless without the other half, so that's the top priority next week.

Also, after thinking about it a bit and talking with some users, I am going to try changing my schedule a little. I always enjoy stability and cleanup work, and I am happy I found time recently to do some more, but it is often difficult to remember this kind of work with the pressure of a hundred smaller (and often sexier) jobs. With this in mind, I will try a rotating schedule where I do one week focusing on stability and cleanup (including improving unit tests), one on larger ongoing work (like the recent autocomplete async work and the thumbnail experiment last week), and two on regular smaller jobs (like most of what I did this week). I'll go cleanup, small, ongoing, small. I will still work on bugs and the 'big job' as normal every week, but the rest of my time will be more focused on one thing. This will force me to spend at least some time on each important area every month without turning my week inside out trying to cram a bit of everything in.

Anyway, I'm going to give that a go and see how it feels. I will do cleanup next week. There are a ton of broken unit tests that need catching up to current code, and I'd love to move forward the listctrl object replacement. Both of these are boring as anything and have been on the back burner, so let's see if I can concentrate next week and move them forward!
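The tag search semantics in the new /get_files/search_files endpoint - plain tags must all be present, and a leading '-' negates - can be sketched over an in-memory mapping like this. This is an illustration of the search logic only, not the client's real code, and all names here are hypothetical.

```python
def search_files(file_tags, predicates):
    # file_tags: dict of file_id -> set of tags.
    # predicates: plain tags must be present on a file; a leading '-'
    # negates, so '-blonde hair' excludes files carrying that tag.
    results = []
    for file_id, tags in sorted(file_tags.items()):
        keep = True
        for pred in predicates:
            if pred.startswith('-'):
                if pred[1:] in tags:  # negated tag present: exclude
                    keep = False
                    break
            elif pred not in tags:  # required tag missing: exclude
                keep = False
                break
        if keep:
            results.append(file_id)
    return results
```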
Thanks for the update. After seeing the "--no_db_temp_files" addition, I'm curious about what kind of data Hydrus stores in your temporary directory. If you use Hydrus on an encrypted drive, will this leak any potentially sensitive data?
(11.37 KB 793x74 1552054554.png)

Hydrus 340+ (haven't tested earlier versions) is prone to locking up on my machine when processing repository updates. My machine runs Arch Linux. When it locks up, the GUI is completely unresponsive. Pic related: the process doesn't seem to be doing anything, and the memory usage is nowhere near high enough for the lockup to be caused by cache thrashing, i.e. constantly paging stuff in and out. What would be causing this? Is this a normal thing in Hydrus that will eventually resolve itself if I leave it long enough? I'm going to leave it overnight and report back results.
How are non-listed file formats handled, like .avi, .flv, etc.? Can they still import, just with no thumbnail?
>if its not obvious im AFK and havent yet installed and played with hydrus but am considering it for a large database of video / memes.
>>11815 Ok, so I left it on overnight and it did make progress while the GUI was locked up. Also, I noticed that the CPU usage sometimes jumped up to ~90%, so it was at least doing something.
>>11816 If something is not supported by hydrus, it's ignored with an "unknown mime type" error. Speaking of this: it would be really cool to actually be able to import any kind of unsupported file into hydrus. Shouldn't be that hard to implement? If you need to open it, just call the standard system handler for it.
When I try to use a download tab to download this dA gallery url: https://www.deviantart.com/ziemospendric/gallery/25806230/Bubblegum-Crisis it starts downloading the furry/alien stuff from the artist's main page. Are galleries not supported?
>>11820 These are sub-galleries, you might want a parser for that.
>>11821 I haven't had time to look into parsers yet, since my Hydrus stability is extremely bad right now. I've been trying to wrap my head around all the issues that appeared after the python 3 upgrade. Well, this might be a good time to start writing these down.

I'm running Kubuntu 18.04, with the provided Hydrus binary version, since I haven't had time to get the new source version working yet. I'm the guy who has that ffmpeg problem. Here is the previous discussion about the issue: >>11554 >>11557 >>11579 >>11594 >>11595

I tried adding the ffmpeg binary to the hydrus bin folder as suggested, which didn't fix the problem. When I select "Help -> debug -> report modes -> subprocess report mode", as was suggested, nothing happens. No window opens, I see no errors, and nothing gets shown in the terminal.

The terminal gets more serious error spam than I ever had in python 2 versions. The worst offender is "GdkPixbuf-CRITICAL", which produces walls of text with errors for tons of different functions:

>(client:11467): GdkPixbuf-CRITICAL **: gdk_pixbuf_get_pixels_with_length: assertion 'GDK_IS_PIXBUF (pixbuf)' failed
>(client:11467): GdkPixbuf-CRITICAL **: gdk_pixbuf_get_rowstride: assertion 'GDK_IS_PIXBUF (pixbuf)' failed
>(client:11467): GdkPixbuf-CRITICAL **: gdk_pixbuf_get_n_channels: assertion 'GDK_IS_PIXBUF (pixbuf)' failed
>(client:11467): GdkPixbuf-CRITICAL **: gdk_pixbuf_get_width: assertion 'GDK_IS_PIXBUF (pixbuf)' failed
>(client:11467): GdkPixbuf-CRITICAL **: gdk_pixbuf_get_height: assertion 'GDK_IS_PIXBUF (pixbuf)' failed

(This keeps going for thousands of lines, repeating the same handful of functions.)

In between, I get warnings for the used theme:

>(client:11467): WARNING : Invalid borders specified for theme pixmap:
>/usr/share/themes/Breeze-Dark/gtk-2.0/../assets/button.png,
>borders don't fit within the image

This might be related to my next problem. Ever since the python 3 update, buttons, tabs and similar graphical elements are not displayed any more. I only know they're there because the button/tab text is still displayed. See the attached screenshot. I'm not sure if this is a problem specific to KDE Plasma, but it used to look fine with the old python 2 versions.
>>11807 That is an important question, and one I had not considered before. Yes, some data will leak if you have a non-encrypted system drive but an encrypted hydrus drive. Hydrus uses your system's temp directory explicitly for several jobs:

- some thumbnails for default mimetypes, like the 'pdf icon' and 'music icon'
- drag and drop to discord with the discord bugfix mode on
- copying a temp instance of the file for file import (both client and server)
- serialised PNG export

And SQLite (absent that new command parameter) uses the temp dir for all temporary transaction data that becomes too large to store in memory. I don't know what this limit is, strictly, and I suspect it differs based on platform/version/etc. - I imagine it is about 1-10MB. For hydrus, this mostly means vacuums and processing updates.

The temp files are all deleted as soon as they are no longer needed (and many of the smaller files, I understand, are never actually written to disk, since they disappear before the OS flushes its write buffers), but this is still a potentially significant leak against a determined attacker.

If you are going to store hydrus on an encrypted drive and are afraid of your meme-hating government scanning your unused system hard drive sectors for old temp data, I strongly recommend you use encryption on your system drive as well. Even if a user or I were to figure out a way to launch hydrus with an environment that pointed to an encrypted temp dir, the OS would still spool secret info in memory to your pagefile or any of a bunch of other unencrypted system-side file index caches and so on. If you want to be protected, go the whole hog.

A possible mitigation might be to regularly wipe your unused disk space. I don't know enough about the subject - maybe your encryption software can do that for you - but tbh the easier solution (that will also cover OS caches) is likely just to encrypt your system as well.
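The SQLite side of this can be illustrated with the `temp_store` pragma, which controls whether temporary tables and indices go to the temp directory or stay in memory (0 = compile-time default, 1 = FILE, 2 = MEMORY). Forcing MEMORY is, as I understand it, roughly the kind of thing the new memory-journalling launch option asks SQLite to do; the exact mechanism hydrus toggles may differ, so treat this as a sketch of the concept.

```python
import sqlite3

# Open a throwaway database and force temporary data to stay in RAM.
conn = sqlite3.connect(':memory:')
conn.execute('PRAGMA temp_store = 2;')  # 2 == MEMORY: no temp-dir spill
mode = conn.execute('PRAGMA temp_store;').fetchone()[0]
conn.close()
```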
>>11818 >>11815 Thank you for this report. Is this during 'idle' time or during shutdown? If it is in idle, can you go to options->maintenance and processing and try changing it so it only processes during shutdown, and only for, say, 5 minutes? Do a test run, and see how long it runs for and how long it takes to do the final 'commit' step. The new repo processing routine takes breaks to commit work, so if you are on a (maybe fragged?) HDD rather than an SSD, commit time may be inflating the work time. Or something else may be going on. If anything, it should be faster to return control to you when 'idle' state is no longer true precisely due to its checkpointing. If shutdown work cancels basically as expected at 5 mins, how about 15 mins?
>>11816 Many are supported. It depends more on what the file actually is rather than its extension - hydrus ignores extensions and runs files through a bunch of file header tests to try to figure out what they really are. So if your avi is a supported mpeg, it'll work. Flv should be supported right now. Just try to import to hydrus and see if it likes it. At the moment, anything not supported will fail to import. If you discover a reasonable-looking video that is not supported, please send it to me, or a link to it, and I'll see if I can figure it out. Mostly it is just a case of my running it through FFMPEG and figuring out what to parse.

>>11819 Yeah, I think I am coming around to this idea. The basics of just saying 'yeah, unknown is allowed' and copying bytes over is no problem, but it changes some fundamental ideas about the program and will need a bunch of error handling changes. Atm, hydrus knows that it can figure out the mime for any file under its control, but if there is a nebulous 'application/octet-stream' or whatever added, this isn't quite true any more. If a PSD goes in and gets renamed to '.bin', I can't atm recover that it was a .psd if the user wants to export it again. Or, say, launch it with an external program that expects .psd. Atm this is a longer term 'ongoing' job. I hope to be able to get at this sort of larger 'it would be nice to have' job in my new 'ongoing' work weeks.
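The header-test approach described here can be sketched with a tiny, hypothetical `sniff_mime` covering a few of the formats mentioned in the thread. These magic numbers are from the public format specs; hydrus's real detection covers far more cases and is more careful.

```python
def sniff_mime(header: bytes):
    # Guess a mime type from the first bytes of a file, ignoring extension.
    if header[:4] == b'RIFF' and header[8:12] == b'WEBP':
        return 'image/webp'
    if header[:4] in (b'II*\x00', b'MM\x00*'):  # little/big-endian tiff
        return 'image/tiff'
    if header[:3] == b'FLV':
        return 'video/x-flv'
    if header[:8] == b'\x89PNG\r\n\x1a\n':
        return 'image/png'
    return None  # unrecognised: hydrus would refuse the import
```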
>>11820 >>11821 >>11823 Thanks for this DA example. Yeah, I think hydrus has no match for the sub-URL, so it is going up a step to everything. I am going to be looking at DA next week for a high-res file trick, so I'll check this out as well. It might be as simple as just adding a new URL class.
>>11823 Hey, I am sorry you are having a bad time here. Some window managers do this 'invisible lines' stuff, and I don't know why. I expect the CRITICAL spam is related. My best guess is that my Ubuntu 16.04 GTK2 .so files are conflicting with your system's equivalents and it is sperging out. It is odd that your subprocess_report_mode isn't doing anything. It is supposed, at the very least, to do some 'attempting to do (this)' stuff before the real fun begins. Perhaps your ui problems are enough that those messages are not displaying. The best solution right now - which isn't guaranteed 100%, but almost always makes things better - is to run from source. Linux just isn't too happy running builds from other flavours. This would likely also fix your ffmpeg location problem, which is due to PyInstaller's unusual launch environment. Please keep me updated, and if you decide to try running from source, let me know if you run into trouble.
>>11832 Thank you for this detailed overview. I realize there is no way to be completely safe with page files and all, which you have no control over, but as far as you can, please keep any temporary data to the install or db folders. Thank you.
>TIF support Hey, we use that for medical xrays at work. Not like I'll bring Hydrus there, but still, neat. Those aren't exactly 48 bit though.
>>11833
>Is this during 'idle' time or during shutdown?
It is when explicitly processing via review services. I haven't yet had it process during idle or shutdown. Also, I am on a HDD, but it's ext4, so fragmentation isn't an issue. And yes, it seems to be just committing for most of the time; it can sit in the commit state for quite a while. I'll post later with the shutdown processing timings, as I'll need to wait an hour for the client to ask to do jobs on shutdown - that seems to be the minimum wait time.
>>11840 Ok, got the timing. The job stopped at the expected 5 minutes, and the commit only took a few seconds.
(268.01 KB 1333x715 3634654.jpg)

>>11805 A bit late, but I tried the nijie.info downloader. It's saying I had 34 items but only displaying 18 images (the right number of works from the artist page). I took a look at the logs, and it looks like it's counting 1 url as 2, so I guess that's kind of normal? Also, something weird, and not sure if this is related, but it's also saying 2 of the images were already in my database, but I don't remember adding either of them, and they're not archived either. Looking at the date, it's the same as the rest of the gallery I just downloaded, so I'm not sure what's going on.
>>11836 Thanks. I will let you know how it works when I run it from source again. Might take a week or two until I have the nerves and time for that.

>>11805 One quick note about webp: one reason it has not caught on is that it only supports 4:2:0 chroma and is fixed to the old VP8 codec. Webm already has 2 newer generations of video codecs (VP9 and AV1) and supports full 4:4:4 chroma, which all other still image formats support. Webp is not being updated to support this for some reason. So it's good that it's supported, but if anybody reading this still wants to be able to look at a pic that's in webp format in 15 years, I would recommend searching for a png or jpg version of it now.
Hey Dev, I'm a fairly new user (discovered hydrus about a month ago) and fellow programmerfag. Props on all the work you've put into this amazing program. I'm currently working on a mobile companion app for hydrus. It will utilize the client API, and when it's done, I'll release it for anyone to use. I'm wondering what your plans are for expanding the client API. A big thing I'd want before I'd consider releasing is a way to search local tags. Is this something you're considering adding? It seems like all you have is a way to clean given tags or add tags to a post, but no way to know what tags already exist. Sorry if this update thread isn't the appropriate place to ask.
(319.37 KB 800x600 IMG_20190311_194047_671.jpg)

I can no longer import any downloaders. Whether I drag and drop or load with the file window, I get the same message:

ValueError: too many values to unpack (expected 2)
Traceback (most recent call last):
  File "include\ClientSerialisable.py", line 270, in LoadFromPng
    ( height, width ) = numpy_image.shape
ValueError: too many values to unpack (expected 2)
>>11848 Is there a way to use the pre-included pixiv downloader to download bookmarks only from a profile?
>>11848 Hey, I am sorry, I cannot reproduce this. Is there any chance the png you are trying to import was a resized version? Can you do me a favour and try to drop onto lain one of the files in install_dir/static/default/parsers? Does one of those files work ok? If one of those files works ok, can you post the png you were trying to import here? Also, what is your apparent version of OpenCV under help->about? Is it 4.0.0, or something else? Regardless, I will improve the error here. It should say something like "This PNG seems to be RGB, whereas I was expecting a monochrome one - could it have been a resized/processed version?" As an example, the attached file works for me, as do the smaller files in the static/default dir.
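For the curious, the traceback in the report above is a classic shape-unpack failure: a greyscale numpy array has a 2-tuple `.shape` of `(height, width)`, but an RGB decode gives `(height, width, 3)`, so a two-name unpack raises ValueError. A minimal sketch of a tolerant version (hypothetical helper, not the actual hydrus fix) is to slice the first two entries:

```python
def get_dimensions(shape):
    # Pull (height, width) from a numpy-style .shape tuple, working for
    # both greyscale (h, w) and colour (h, w, channels) arrays.
    if len(shape) < 2:
        raise ValueError('not a 2D image shape: {}'.format(shape))
    height, width = shape[:2]
    return height, width
```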
>>11837 Thanks. I'll make a job to add a temp folder override parameter, so you can set it to wherever you like. I am pretty sure I can force this at the environment level and catch SQLite as well, but I'll have to test it with pyinstaller's extra layer of stuff here.
>>11849 I don't think so - afaik, the artist downloader pulls everything they made. Is a bookmark like artist favourites? Maybe the tag downloader can be mangled with some kind of 'artist_bookmarks:123456' tag? I don't know anything about advanced pixiv searching, though. Otherwise, it probably needs a new parser and gug to grab that info, wherever it is.
>>11838 A user helpfully pointed me to some here: http://www.brucelindbloom.com/index.html?RGB16Million.html They render ok, so I guess a clever sRGB conversion is happening. I also added support for 'MM' (big-endian) tiffs for tomorrow, which should complete tiff support.
>>11840 >>11841 If you have a lot of processing still to do, I recommend you do not do explicit processing from that review services frame. It can only be cancelled through the UI, and after a long time, there is a decent chance the UI will deadlock until the job is done (this is due to some shit code by me that will take a bit to clear up). If the job still has 15 hours of work left, the whole program can hang that long. I recommend you only let processing happen during shutdown, where it has a timer, and idle time, where moving the mouse will tell it to safely cancel. That review services button is hidden behind advanced mode and is only ever really a pleasant experience when I am doing some testing or when on a fast SSD without much total processing left to do.
>>11842 Yeah, for some complicated websites, an import 'item' produces more than one file or a new page to pursue. It is a technical pain in the ass, but for now, the x/y progress on an importer refers to 'import items processed', not 'potential files imported'.

If you would like to check those 'already in db' files, you should be able to right-click on them and say something like 'open these in a new page'. Since that note says '10 seconds ago', they are almost certainly duplicates from above. (I don't know anything about nijie.info, but yeah, it looks like the _0 url without diff/main is a 'cover image' for the subsequent mini-gallery?) Again, some galleries give the same file twice, even on their nice APIs. I don't know why this parser pulls that 'cover', but I did not write it, so I can't confidently say this strategy isn't needed to catch edge cases. The nuts and bolts of this stuff are frequently fuckery duckery doo, particularly on Japanese sites.
>>11844 Thanks, that is interesting. I assume the lossless compression mode is functionally 4:4:4, right? But any lossy encoding is coerced to 4:2:0? My hesitance about webp in the past is that it is still sRGB, so I don't know how useful it will be as we move to HDR in the coming years. Maybe they will do like webm and add newly accepted encoders, but I dunno. It doesn't seem to be taking the world of internet browsers and phones by storm as it is. HEIF and FLIF seem to have their good points, but they are still similarly meme status for now.

I'll play with animated webps a bit when I find time, as I'd really prefer not to use shitty gifs for animated thumbs, and I don't want to go completely nuts with apngs either.
>>11846 Thanks, I am glad you like it. When you get an app ready and want to share it about, please send me a link and I'll put it up in my release posts and help!

The current plan for the Client API is to get a simple 1.0 out the door. This should be done tomorrow for 343, where basic file search and file/thumbnail retrieval will be finished. You'll be able to find all the files in a client with the tags 'blue eyes' and 'blonde hair' and then show thumbs, files and tags. It should be possible to replicate very basic booru functionality. After the release tomorrow, please check the updated help here:

https://hydrusnetwork.github.io/hydrus/help/client_api.html

Which will have the last '/get_files/…' queries all filled out.

With the 1.0 done, Client API work will then be fit into the regular weekly 'small jobs' schedule. If someone wants the ability to archive files or search by duration, I'll see if I can squeeze it into a small job. If you have ideas for your app, please submit them any way that is convenient - these release threads are fine, and you can also email me or even DM me on discord on Saturday afternoons US time if you want to talk live.
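For app authors following along, a search call can be sketched as building a GET request against the endpoint named in the release notes. The default port, the url-encoded JSON `tags` parameter, and the access-key header here are my assumptions from this era of the API - confirm the exact contract against the client_api help linked above before relying on it.

```python
import json
import urllib.parse

def build_search_request(base_url, access_key, tags):
    # Build the url and headers for a /get_files/search_files GET.
    # Parameter encoding and header name are assumptions - see the
    # client_api help for the authoritative contract.
    params = urllib.parse.urlencode({'tags': json.dumps(tags)})
    url = '{}/get_files/search_files?{}'.format(base_url.rstrip('/'), params)
    headers = {'Hydrus-Client-API-Access-Key': access_key}
    return url, headers

# example: search for files tagged 'blue eyes' and 'blonde hair'
url, headers = build_search_request(
    'http://127.0.0.1:45869', 'replace-with-your-access-key',
    ['blue eyes', 'blonde hair'])
```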

