May 28th, 2007, 02:58 PM
#3
Now it is clear to me how buffering works.
I have the following situation, for which I would like to propose a feature request:
I have an item with such a high update frequency that a client cannot keep up with receiving all of the updates in DISTINCT mode without occasionally losing some records. That's a fact.
I saw that another streaming solution does the following when streaming executed orders (price, quantity, and timestamp):
- If a number of consecutive records have to be discarded (in our case, because the buffer on the client or the server side reached the allowed limit), it groups all of them in a way similar to this SQL query:
SELECT price, MAX(timestamp), SUM(quantity)
FROM <the discarded records>
GROUP BY price
Very often, when there is a large burst of updates at the same time, they will share the same price, so they will be grouped into one item or two, and this resolves the issue.
The idea here is that LS invokes a DataAdapter or MetaAdapter function with the to-be-discarded records as input, and the function returns a smaller list based on custom logic (in my case, the aggregation expressed in the SQL above).
I thought about implementing this inside the Data Adapter itself, but it turns out to be difficult (it means I would have to hold back the listener.update() calls until I have received enough events...).
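To make the proposal concrete, here is a minimal sketch of what such a coalescing hook could look like. This is purely illustrative: the OrderUpdate class and the coalesce() method are hypothetical names, not part of any Lightstreamer API; the method just mirrors the SQL above (group by price, take MAX(timestamp), SUM(quantity)).

```java
import java.util.*;

public class CoalesceSketch {

    // Hypothetical record for one executed-order update.
    static class OrderUpdate {
        final double price;
        final long quantity;
        final long timestamp;

        OrderUpdate(double price, long quantity, long timestamp) {
            this.price = price;
            this.quantity = quantity;
            this.timestamp = timestamp;
        }
    }

    // Equivalent of: SELECT price, MAX(timestamp), SUM(quantity) ... GROUP BY price.
    // Input: the updates that would otherwise be discarded; output: a smaller list.
    static List<OrderUpdate> coalesce(List<OrderUpdate> discarded) {
        Map<Double, OrderUpdate> byPrice = new LinkedHashMap<>();
        for (OrderUpdate u : discarded) {
            byPrice.merge(u.price, u, (a, b) ->
                    new OrderUpdate(a.price,
                            a.quantity + b.quantity,          // SUM(quantity)
                            Math.max(a.timestamp, b.timestamp))); // MAX(timestamp)
        }
        return new ArrayList<>(byPrice.values());
    }

    public static void main(String[] args) {
        // A burst of updates: two at the same price collapse into one record.
        List<OrderUpdate> burst = Arrays.asList(
                new OrderUpdate(10.5, 100, 1000),
                new OrderUpdate(10.5, 200, 1001),
                new OrderUpdate(10.6, 50, 1002));
        for (OrderUpdate u : coalesce(burst)) {
            System.out.println(u.price + " qty=" + u.quantity + " ts=" + u.timestamp);
        }
    }
}
```

With a hook like this, the server would hand the overflowing buffer to the adapter and forward only the coalesced records, instead of silently dropping updates.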
What do you think?
Thanks
A