

deltaslayer - Group: Member - Total Posts: 8
Page limit
Posted on: 02/26/18 11:43PM

Hi,

I understand the purpose of limiting paging navigation to 1000 on the website (I believe it's done in favor of SEO or something?). However, can anyone explain the purpose of the same limitation on the API?

The whole reason I went for the API was to browse past this limit. For example, the tag pokemon, even when filtered by two tags, can return more images than the 1000-page limit covers. I tried browsing past page 1000 without any tags; it lets me reach page 1212 so far, but if I enter 1500 I get the error "Too deep! Pull it back some. Holy fuck.". Is this because of the performance cost on the database? Is there any way to work around it, i.e. perform that kind of query without putting a heavy load on the database/server?

In my experience, other APIs have similar limitations; however, I've worked with Laravel and there is no such limitation there, and paging seems to work fine even over very large lists.



deltaslayer - Group: Member - Total Posts: 8
Posted on: 02/26/18 11:45PM

Also, just a thought: shouldn't that error return a 500 or 404 response rather than a 200?



lozertuser - Group: The Fake Administrator - Total Posts: 2230
Posted on: 02/27/18 01:07AM

Please read through the FAQ. It's a CPU load issue with large offsets.



deltaslayer - Group: Member - Total Posts: 8
Posted on: 02/27/18 01:51AM

Sorry, I had to Google where the FAQ sits on the site. After reading around, I understand the offset is a problem for SEO, but is it still an issue for the API as well? If so, I found a workaround I'm willing to try: it was said at gelbooru.com/index.php?page=forum&s=view&id=1549 that you can request posts from a range/set of IDs, but I'm not sure how to achieve this with the API, since the argument mentioned on the API page seems to accept only one ID. Is there a way to use the API to return a list of posts, e.g. from ID 1 to 10?



Jerl - Group: The Real Administrator - Total Posts: 6706
Posted on: 02/27/18 02:06AM

Exactly the same way as you do in a normal search. Metatags are tags and work just fine whether you're searching using the main listing or the API.

The API request for all posts with IDs less than 3,000,000 should look like this:

/index.php?page=dapi&s=post&q=index&tags=id:<3000000

The request for just posts with IDs from 1 to 10 would be this:

/index.php?page=dapi&s=post&q=index&tags=id:<11
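
For anyone scripting this, here is a minimal Python sketch of walking an entire tag search by ID instead of by offset, building on the id:< metatag above. The limit parameter, the newest-first ordering of results, and the XML layout with id/file_url attributes on each post element are assumptions based on the public API documentation, and fetch_all is just an illustrative name, not an official helper.

import xml.etree.ElementTree as ET
import requests

BASE_URL = "https://gelbooru.com/index.php"

def fetch_all(tags, page_size=100):
    """Yield every post matching `tags`, walking IDs downward via id:<."""
    last_id = None
    while True:
        # Append an id:< metatag instead of bumping a page offset, so the
        # database never has to skip past a deep offset.
        query = tags if last_id is None else "%s id:<%d" % (tags, last_id)
        params = {
            "page": "dapi",
            "s": "post",
            "q": "index",
            "limit": page_size,  # assumed page-size parameter
            "tags": query,
        }
        resp = requests.get(BASE_URL, params=params, timeout=30)
        resp.raise_for_status()
        posts = ET.fromstring(resp.text).findall("post")
        if not posts:
            return
        for post in posts:
            yield post
        # Assuming results come back newest-first, the last post in the
        # batch carries the lowest ID seen so far.
        last_id = int(posts[-1].get("id"))

# Usage:
# for post in fetch_all("pokemon rating:safe"):
#     print(post.get("id"), post.get("file_url"))

Because every request only ever filters by id:<N, no query asks the database to skip a deep offset, which is exactly what the page limit exists to prevent.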



deltaslayer - Group: Member - Total Posts: 8
Posted on: 02/27/18 02:29AM

I'm not an experienced user of the site, so I'm really sorry for missing the above, and thanks a lot for pointing me in the right direction. I've tested it and I think I can work around the limit with it. However, this raises another issue: the API (or the regular search) is limited to two tags per search, so if I use the above, does that mean I'm down to only one tag while searching? Although I notice I can blacklist as many tags as I want (I tried blacklisting six tags).



Jerl - Group: The Real Administrator - Total Posts: 6706
Posted on: 02/27/18 02:45AM

The API and the regular search are not limited to two tags per search. There is a limit, but it's based on the total size of the HTTP request header, and it's so large that you'd have to try really hard to even find posts with enough tags for it to matter; even then, the search will have narrowed things down to a single post well before you get anywhere near the limit. You can, for all intents and purposes, search for as many tags at once as you want.

gelbooru.com/index.php?page=post&s=list&tags=rating:safe+blonde_hair+long_hair+blue_eyes+2girls+alcohol+-beer+-comic

gelbooru.com/index.php?page=post&s=list&tags=00s%2010s%201boy%202016%206%2Bgirls%2090s%20absolutely_everyone%20absurdres%20alice_(kyokugen_dasshutsu)%20amalia_sheran_sharm%20areolae%20ass%20asui_tsuyu%20ayanami_rei%20ayasato_mayoi%20barefoot%20bathhouse%20bathing%20batman_(series)%20black_hair%20blonde_hair%20blue_eyes%20blue_hair%20blush%20boku_no_hero_academia%20breast_envy%20breasts%20broom%20brown_eyes%20brown_hair%20bulma%20capcom%20cat%20crossover%20dango%20dark_skin%20dc_comics%20dofus%20dragon_ball%20drinking%20eating%20elena%20elena_(street_fighter)%20elf%20empowered%20empowered_(series)%20eruka_frog%20eyeshield_21%20feet%20final_fantasy%20final_fantasy_vii%20flat_chest%20food%20fushigi_no_umi_no_nadia%20fuu%20genderswap%20genshiken%20ghost%20green_eyes%20green_hair%20guilty_gear%20guilty_gear_xrd%20gyakuten_saiban%20hanging_breasts%20harley_quinn%20haruno_sakura%20head_wings%20hex_maniac_(pokemon)%20highres%20horns%20hunter_x_hunter%20kasugano_sakura%20kermit_the_frog%20kill_la_kill%20konno_(genshiken)%20kyokugen_dasshutsu%20kyokugen_dasshutsu%3A_9_jikan_9_nin_9_no_tobira%20kyokugen_dasshutsu_adv%3A_zennin_shibou_desu%20large_breasts%20lilith_aensland%20long_hair%20lunch_(dragon_ball)%20maliki%20maliki_(character)%20mankanshoku_mako%20mario_(series)%20marvel_vs._capcom%20marvel_vs._capcom_2%20medusa_gorgon%20mixed_bathing%20multicolored_hair%20multiple_crossover%20multiple_girls%20muppets%20nadia%20naruto%20navel%20neon_genesis_evangelion%20nichijou%20ninjette%20nintendo%20nipples%20noise_tanker%20nude%20one-punch_man%20one_piece%20onsen%20orange_hair%20osamodas%20pakunoda%20panty_%26_stocking_with_garterbelt%20partially_submerged%20perona%20pink_hair%20pointy_ears%20pokemon%20pokemon_(game)%20pokemon_xy%20ponytail%20princess_peach%20pubic_hair%20purple_hair%20pussy%20ramlethal_valentine%20ranma-chan%20ranma_1%2F2%20reading%20red_eyes%20sakamoto_(nichijou)%20samurai_champloo%20sanshoku_dango%20saotome_ranma%20sekiguchi_yuria%20shinonome_nano%20short_hair%20signature%20simone_(dofus)%20small_breasts%20sonson%20soryu_asuka_langley%20soul_eater%20splashing%20spunch_comics%20spunchette%20steam%20stocking_(psg)%20street_fighter%20street_fighter_iii%20street_fighter_iii_(series)%20super_mario_bros.%20tail%20taki_suzuna%20tatsumaki%20tattoo%20tendou_nabiki%20the_disaster_artist%20tray%20tsunade%20uncensored%20vampire_(game)%20wagashi%20wakfu%20white_hair%20yoshitake_rika%20yuffie_kisaragi

These both work. This is not limited to Patreon supporters or registered users. Everyone can do this.
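
If you're assembling those long tag lists programmatically, here is a quick Python sketch; the tag list is arbitrary, and joining with an encoded space simply mirrors the example URLs above.

from urllib.parse import quote

tags = ["rating:safe", "blonde_hair", "long_hair", "blue_eyes",
        "2girls", "alcohol", "-beer", "-comic"]

# quote() percent-encodes characters such as ':' in rating:safe; letters,
# digits, '-' and '_' pass through unchanged, so negated tags keep their '-'.
query = "%20".join(quote(tag, safe="") for tag in tags)
url = "https://gelbooru.com/index.php?page=post&s=list&tags=" + query
print(url)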



deltaslayer - Group: Member - Total Posts: 8
Posted on: 02/27/18 03:15AM

I have no idea why I always thought there was a limit on the tags you can search by ... maybe the set of tags I randomly tried just had no results. Thanks for all the examples above; they'll help a lot in setting up the workaround for my app.



Hinata_2-8 - Group: Member - Total Posts: 74
Posted on: 04/30/18 02:06PM

deltaslayer said:
Hi,

I understand the purpose of limiting paging navigation to 1000 on the website (I believe it's done in favor of SEO or something?). However, can anyone explain the purpose of the same limitation on the API?

The whole reason I went for the API was to browse past this limit. For example, the tag pokemon, even when filtered by two tags, can return more images than the 1000-page limit covers. I tried browsing past page 1000 without any tags; it lets me reach page 1212 so far, but if I enter 1500 I get the error "Too deep! Pull it back some. Holy fuck.". Is this because of the performance cost on the database? Is there any way to work around it, i.e. perform that kind of query without putting a heavy load on the database/server?

In my experience, other APIs have similar limitations; however, I've worked with Laravel and there is no such limitation there, and paging seems to work fine even over very large lists.


I tried it with Hatsune_Miku; you get an error if you try to go to the end.



Jerl - Group: The Real Administrator - Total Posts: 6706
Posted on: 04/30/18 05:34PM

As has been stated, that's intentional. If you read the FAQ, it tells you how to get around it.




