Pagination API with query returns duplicate issues

Hey, I'm executing the get-issues API with pagination, but from what I see in the log it returns duplicate issues when I print the issue ID from each iteration.

This is what the API calls look like, starting from:

0-1000 
https://track.personetics.com/youtrack/api/issues?fields=id,created,summary,date,idReadable,customFields(id,projectCustomField(field(name)))&$skip=0&$top=1000&query=has: attachments created: 2016-01-01 .. 2017-12-30

1000-2000 

https://track.personetics.com/youtrack/api/issues?fields=id,created,summary,date,idReadable,customFields(id,projectCustomField(field(name)))&$skip=1000&$top=2000&query=has: attachments created: 2016-01-01 .. 2017-12-30

2000-3000

https://track.personetics.com/youtrack/api/issues?fields=id,created,summary,date,idReadable,customFields(id,projectCustomField(field(name)))&$skip=2000&$top=3000&query=has: attachments created: 2016-01-01 .. 2017-12-30

 

and so on, until there are no issues left between the dates.
But I can see duplicate issues being returned, and the issue dates it is scanning are not ordered:
sometimes they are from 2016, sometimes from 2017.
Very strange...


5 comments
Official comment

Hi!

I'm Sergey from the YouTrack team.

Thank you for the post. Let's try to sort that out.

Firstly, please make sure to URL-encode the query, escaping the special characters in the URI. For example, has: attachments created: 2016-01-01 .. 2017-12-30 should become has%3A%20attachments%20created%3A%202016-01-01%20..%202017-12-30 in the URI.
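As a quick illustration (plain Python standard library, nothing YouTrack-specific), the query can be percent-encoded like this:

```python
from urllib.parse import quote

# The raw YouTrack search query from the original request.
query = "has: attachments created: 2016-01-01 .. 2017-12-30"

# quote() percent-encodes spaces (%20) and colons (%3A);
# digits, dashes, and dots are left as-is.
encoded = quote(query)
print(encoded)
# has%3A%20attachments%20created%3A%202016-01-01%20..%202017-12-30
```

Most HTTP client libraries will do this for you if you pass the query as a parameter dictionary instead of building the URL string by hand.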

Secondly, you are using pagination incorrectly in your example.

>0-1000 

In this case, the top value in the URI should be 1000, not 5 as it is in your example.

>1000-2000 

No, in your example you iterate through 1000-3000, since you skip the first 1000 and request the next 2000 on top of that. So you get 1000 issues from your first request and another 2000 from this one.

>2000-3000

You should skip the first 3000, as you already retrieved them in the first two requests. You can leave the top value at 3000 to get 3000 more; that's up to you.

So basically, you retrieve the same issues over different calls, and that's why you get the same IDs.
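To see the overlap concretely, here is a small sketch (plain Python, no HTTP; a local list of hypothetical issue IDs stands in for the server's result set) replaying the $skip/$top values from the original requests:

```python
# Hypothetical issue IDs standing in for the server's full result set.
issues = list(range(5000))

# The ($skip, $top) pairs as used in the original three requests.
requests = [(0, 1000), (1000, 2000), (2000, 3000)]

seen = []
for skip, top in requests:
    page = issues[skip:skip + top]  # what the server would return for this call
    seen.extend(page)

# Request 2 returns IDs 1000-2999 and request 3 returns IDs 2000-4999,
# so IDs 2000-2999 appear twice in the combined list.
print(len(seen))       # 6000 items fetched in total
print(len(set(seen)))  # but only 5000 unique IDs
```

This reproduces exactly the duplicate-ID symptom from the original post.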

If any questions appear, let me know. 



@Sergey Merzlov
I fixed the question; it was a copy/paste mistake.
The URL is well encoded, I just pasted the original request.
Now, to be clear: I'm receiving the issues, but there are duplicates.


Thanks for your reply.

>now to be clear I'm receiving the issues but there are duplicates 

Sorry, did you see my reply about the pagination part? 


@Sergey Merzlov

Thanks!
It's confusing.
So my iteration should look like this?
0 - 1000

1000-1000

2000-1000

3000-1000

and so on, if I'd like to get batches of 1000 each time?


Yes, that's right. 
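The pattern confirmed above (a fixed $top while $skip advances by the page size) can be sketched as a small loop. This is a minimal illustration, not YouTrack client code: the fetch function is a hypothetical stand-in for the actual HTTP GET.

```python
def fetch_all(fetch, page_size=1000):
    """Collect every item by paging with a fixed top and an advancing skip."""
    items = []
    skip = 0
    while True:
        # In the real API this would be:
        #   GET .../api/issues?...&$skip={skip}&$top={page_size}&query=...
        page = fetch(skip, page_size)
        if not page:          # empty page: no issues left between the dates
            break
        items.extend(page)
        skip += page_size     # advance by the page size, never by a growing top
    return items

# Usage with a fake in-memory "server" of 2500 hypothetical issue IDs:
data = list(range(2500))
result = fetch_all(lambda skip, top: data[skip:skip + top], page_size=1000)
print(len(result) == len(set(result)))  # True: no duplicates
```

Each call asks for the next window of at most 1000 issues, so no window overlaps the previous one.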

