  1. #1
    Senior Member
    Join Date
    Nov 2006
    Location
    Riyadh
    Posts
    33

    Problem with DISTINCT subscription and very high frequency updates

    Hi all,

    I have an item that represents the trades/transactions of a stock. It is subscribed in DISTINCT mode in order to get the transactions one by one in detail. The adapter sometimes produces very high frequency updates for such an item (the Metadata Adapter specifies 20 updates/sec as the value of the MaxFrequency parameter, but the adapter sometimes produces 100-200 updates/sec for this item).

    The problem is that the client perceives a very high delay: the server delivers at most 20 updates/sec (as expected) but does not discard intermediate updates in order to privilege the most recent ones, so they queue up.

    How can I improve this without increasing MaxFrequency/Bandwidth?

    Thanks
    A

  2. #2
    Administrator
    Join Date
    Jul 2006
    Location
    Milan
    Posts
    972
    Hi,

    In DISTINCT mode, the default for the internal buffer size is a potentially unlimited buffer (while, in MERGE mode, the default is a buffer of size 1).
    Having a large buffer for an item can be useful when updates for the item may come in bursts, so that all updates in the burst are queued and dispatched one at a time. This, however, is true only provided that the mean update frequency from the Data Adapter is much lower than the frequency limit imposed for the item. Otherwise, the buffer is always full and this causes latencies.
    So, try lowering the buffer size, either in the Metadata Adapter configuration or in the client Table configuration.
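    As an illustration of that policy, here is a minimal stand-in sketch. This is NOT the real Lightstreamer Metadata Adapter API; the method shape only loosely mirrors a hook such as getAllowedBufferSize, and the "trades_" item-name prefix and the -1 "keep the default" convention are hypothetical choices for this example.

```java
// Illustrative stand-in only: a per-item buffer-size policy, sketched as a
// plain static method. Real adapters would express this through the actual
// Metadata Adapter configuration or API, not this class.
public class BufferPolicy {
    static final int TRADES_BUFFER = 10; // assumption: cap the DISTINCT queue at 10 updates

    // Returns the buffer size to enforce for this item, or -1 to keep
    // whatever default is configured on the Server (hypothetical convention).
    public static int allowedBufferSize(String user, String item) {
        if (item.startsWith("trades_")) {
            return TRADES_BUFFER; // bounded queue: during a burst, old queued updates are trimmed
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(allowedBufferSize("user1", "trades_ACME")); // prints 10
        System.out.println(allowedBufferSize("user1", "quote_ACME"));  // prints -1
    }
}
```

    With a small bound like this, a burst no longer accumulates unbounded latency: at most 10 updates wait in the queue and older ones are conflated away.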

    Just in case, consider that the main differences between the MERGE and DISTINCT modes are about the interpretation of the updates coming from the Data Adapter, rather than about update delivery.
    In particular, missing fields from the updates received from the Data Adapter are treated as unchanged fields in MERGE and as null fields in DISTINCT.
    Moreover, the snapshot is maintained as the last (merged) update in MERGE mode and as a historical list of updates in DISTINCT.
    So, make sure that the DISTINCT mode is suitable for your case.
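    A minimal sketch of that field-interpretation difference, using plain Java maps (nothing Lightstreamer-specific; field names are invented for the example):

```java
import java.util.*;

public class ModeSemantics {
    // MERGE: fields missing from an update are treated as unchanged,
    // so the new state is the old state overlaid with the update.
    static Map<String, String> applyMerge(Map<String, String> state,
                                          Map<String, String> update) {
        Map<String, String> next = new HashMap<>(state);
        next.putAll(update);
        return next;
    }

    // DISTINCT: each update stands alone; fields missing from it are null.
    static Map<String, String> applyDistinct(List<String> schema,
                                             Map<String, String> update) {
        Map<String, String> next = new HashMap<>();
        for (String f : schema) next.put(f, update.get(f)); // null if absent
        return next;
    }

    public static void main(String[] args) {
        Map<String, String> state = Map.of("last", "15.20", "qty", "100");
        Map<String, String> update = Map.of("last", "15.25"); // no qty field
        System.out.println(applyMerge(state, update).get("qty"));                     // prints 100
        System.out.println(applyDistinct(List.of("last", "qty"), update).get("qty")); // prints null
    }
}
```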

    Dario

  3. #3
    Senior Member
    Join Date
    Nov 2006
    Location
    Riyadh
    Posts
    33
    Now it is clear to me how buffering works.

    I have the following situation for which I would like to propose a feature request:

    I have a very high update frequency item whose updates a client cannot fully keep up with in DISTINCT mode without sometimes losing some records. That's a fact.

    I saw that another streaming solution does the following for streaming executed orders (price, quantity and timestamp):
    - If a number of consecutive records have to be discarded (in our case, because the buffer on the client or the server side reached the allowed limit), it groups all of them in a way similar to this SQL query, run over the set of to-be-discarded records:

    select price, max(timestamp), sum(quantity)
    from discarded_records
    group by price

    Very often, when there is a large burst of updates at the same time, they share the same price, so they collapse into just one or two records, which resolves the issue.

    The idea here is that LS invokes a Data Adapter or Metadata Adapter function with the to-be-discarded records as input, and the function returns a smaller list based on custom logic (in my case, the grouping shown in the SQL above).

    I thought about doing this in the Data Adapter, but it turns out to be difficult (it would mean holding back the listener.update() calls until enough events have accumulated...).

    What do you think ?

    Thanks
    A

  4. #4
    Administrator
    Join Date
    Jul 2006
    Location
    Milan
    Posts
    972
    Hi,

    First of all: in the previous answer I forgot to mention the Preprocessor, which is where updates for very high frequency items (that the Data Adapter cannot filter itself) can be filtered before they reach the various session-related buffers. This may save some load.

    However, no custom filtering mechanism is available.
    The introduction of special cases of field merging in case of filtering is already in our wish list, but more sophisticated kinds of filtering are not considered at the moment.
    Your case is not even a simple case of merging, because an update to be filtered would not necessarily be merged with the next one, but, based on the price, it could be merged with a more recent one, thus not preserving the order.
    We'll think about the suggestion.

    For your specific filtering requirement, one approach is still possible, though it complicates the Data Adapter somewhat:
    • The item is subscribed in COMMAND mode and the price is the KEY field.
    • The Data Adapter keeps the quantity for each price and adds a "cumulative quantity" field for each update. It dispatches updates for new prices as "ADD"s and updates for known prices as "UPDATE"s.
    • The Client also keeps the quantity for each price and, upon the reception of each update, finds the difference in the cumulative quantities. It also finds the last timestamp in the received update.
    • In order to avoid keeping in memory the state for prices which are no longer used, the Data Adapter could "garbage collect" prices by generating a "DELETE" event when a price has not been used for a timeout (long enough to be sure that the last event has been dispatched to all subscribed clients).

    Note that the Data Adapter does not introduce latencies on the updates, and the filtering is still performed by Lightstreamer on a per-session basis. Note, however, that the buffer size for each single price cannot be greater than 1.
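    A rough sketch of the cumulative-quantity bookkeeping from the steps above (class and method names are illustrative, not the actual Lightstreamer adapter API):

```java
import java.util.*;

public class CumulativeQty {
    // Adapter side: keep a running total per price; emit "ADD" for a new
    // price, "UPDATE" for a known one, always carrying the cumulative quantity.
    static class Adapter {
        private final Map<Double, Long> totals = new HashMap<>();
        String[] onTrade(double price, long qty) {
            boolean isNew = !totals.containsKey(price);
            long cum = totals.merge(price, qty, Long::sum);
            return new String[] { isNew ? "ADD" : "UPDATE",
                                  Double.toString(price), Long.toString(cum) };
        }
    }

    // Client side: even if intermediate updates for a key are conflated away
    // (buffer size 1 per price), the quantity traded since the last received
    // update is recovered as a difference of cumulative values.
    static class Client {
        private final Map<Double, Long> lastSeen = new HashMap<>();
        long tradedSince(double price, long cumulative) {
            long prev = lastSeen.getOrDefault(price, 0L);
            lastSeen.put(price, cumulative);
            return cumulative - prev;
        }
    }

    public static void main(String[] args) {
        Adapter a = new Adapter();
        a.onTrade(15.25, 20);               // cum=20  (conflated away, never delivered)
        a.onTrade(15.25, 50);               // cum=70  (conflated away too)
        String[] u = a.onTrade(15.25, 100); // cum=170 (this one is delivered)
        Client c = new Client();
        System.out.println(c.tradedSince(15.25, Long.parseLong(u[2]))); // prints 170
    }
}
```

    The point of the cumulative field is visible in main: two intermediate updates are lost to conflation, yet no traded quantity is lost.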

    Dario

  5. #5
    Senior Member
    Join Date
    Nov 2006
    Location
    Riyadh
    Posts
    33
    Thanks for the info,

    My needs are simpler than that:

    Quote (from the previous answer):
    "Your case is not even a simple case of merging, because an update to be filtered would not necessarily be merged with the next one, but, based on the price, it could be merged with a more recent one, thus not preserving the order.
    We'll think about the suggestion."


    In fact, yes: what I am looking for is for an update to be merged only with the following ones, so that the price execution order stays correct. For instance:

    10:00:00.001, price=15.25, quantity=20
    10:00:00.002, price=15.25, quantity=50
    10:00:00.010, price=15.25, quantity=100
    10:00:01.001, price=15.50, quantity=10
    10:00:01.020, price=15.50, quantity=100
    10:00:01.030, price=15.25, quantity=10
    10:00:01.040, price=15.25, quantity=20

    In this case, if this group is about to be discarded (usually the drop happens within a much longer sequence), I would merge it this way:

    10:00:00.010, price=15.25, quantity=170
    10:00:01.020, price=15.50, quantity=110
    10:00:01.040, price=15.25, quantity=30

    If the merged group is still too big to be delivered without drops, then I can live with that. But this technique would solve most drop issues in reality.
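    The merge described above (collapse only runs of consecutive same-price records, keeping the last timestamp and summing the quantities) is easy to express; a minimal sketch:

```java
import java.util.*;

public class BurstMerge {
    record Trade(String ts, double price, int qty) {}

    // Collapse runs of consecutive same-price trades: the last timestamp of
    // the run wins and the quantities are summed, so order is preserved.
    static List<Trade> merge(List<Trade> burst) {
        List<Trade> out = new ArrayList<>();
        for (Trade t : burst) {
            int i = out.size() - 1;
            if (i >= 0 && out.get(i).price() == t.price()) {
                out.set(i, new Trade(t.ts(), t.price(), out.get(i).qty() + t.qty()));
            } else {
                out.add(t);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Trade> burst = List.of(
            new Trade("10:00:00.001", 15.25, 20),
            new Trade("10:00:00.002", 15.25, 50),
            new Trade("10:00:00.010", 15.25, 100),
            new Trade("10:00:01.001", 15.50, 10),
            new Trade("10:00:01.020", 15.50, 100),
            new Trade("10:00:01.030", 15.25, 10),
            new Trade("10:00:01.040", 15.25, 20));
        // Produces the three merged records from the example above:
        // quantities 170, 110 and 30.
        for (Trade t : merge(burst)) {
            System.out.println(t.ts() + ", price=" + t.price() + ", quantity=" + t.qty());
        }
    }
}
```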

    Regards,
    A

  6. #6
    Administrator
    Join Date
    Jul 2006
    Location
    Milan
    Posts
    972
    ok,

    this is still not a simple case of merging (from Lightstreamer's point of view), because the custom code would decide whether to merge the new event or not to filter it at all. It could only be applied as an extension of unfiltered dispatching, not in the general case.

    However, in this case, even the approach shown in the previous answer would not be acceptable, as it would not preserve the update order.
    This kind of filtering could only be performed on the Data Adapter side. If a short time is allowed for keeping an update before sending it to Lightstreamer, some filtering may still be possible. In your example, by setting a maximum delay of a few milliseconds, you would still get 3 updates.
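    That delay-bounded variant could be sketched as below. This is illustrative only: timestamps are supplied by the caller rather than read from a real clock, and the 25 ms window used in main is an assumption tuned to the gaps in the example from post #5 (not a recommended value).

```java
import java.util.*;

public class DelayedCoalescer {
    record Trade(long tsMillis, double price, int qty) {}

    private final long maxDelayMillis;
    private final List<Trade> pending = new ArrayList<>();

    DelayedCoalescer(long maxDelayMillis) { this.maxDelayMillis = maxDelayMillis; }

    // Buffer a trade; once the oldest pending trade has waited for
    // maxDelayMillis, release the batch accumulated so far (coalesced)
    // and start buffering again. An empty list means "still holding".
    List<Trade> offer(Trade t) {
        List<Trade> ready = List.of();
        if (!pending.isEmpty() && t.tsMillis() - pending.get(0).tsMillis() >= maxDelayMillis) {
            ready = flush();
        }
        pending.add(t);
        return ready;
    }

    // Collapse runs of consecutive same-price trades in the pending batch:
    // last timestamp wins, quantities are summed, order is preserved.
    List<Trade> flush() {
        List<Trade> out = new ArrayList<>();
        for (Trade t : pending) {
            int i = out.size() - 1;
            if (i >= 0 && out.get(i).price() == t.price()) {
                out.set(i, new Trade(t.tsMillis(), t.price(), out.get(i).qty() + t.qty()));
            } else {
                out.add(t);
            }
        }
        pending.clear();
        return out;
    }

    public static void main(String[] args) {
        DelayedCoalescer c = new DelayedCoalescer(25); // assumed 25 ms window
        List<Trade> merged = new ArrayList<>();
        List<Trade> burst = List.of(
            new Trade(1, 15.25, 20), new Trade(2, 15.25, 50), new Trade(10, 15.25, 100),
            new Trade(1001, 15.50, 10), new Trade(1020, 15.50, 100),
            new Trade(1030, 15.25, 10), new Trade(1040, 15.25, 20));
        for (Trade t : burst) merged.addAll(c.offer(t));
        merged.addAll(c.flush()); // drain whatever is still pending
        System.out.println(merged.size()); // prints 3: quantities 170, 110, 30
    }
}
```

    The trade-off is exactly the one stated above: each update can be held for at most maxDelayMillis before it is forwarded, so a small, bounded latency buys a large reduction in the number of updates during bursts.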

    Dario

  7. #7
    Senior Member
    Join Date
    Nov 2006
    Location
    Riyadh
    Posts
    33
    Ok, thanks for the reply.

    We will try to do it at the data adapter level then.

    Regards,
    A

Similar Threads

  1. Difference between DISTINCT and MERGE mode?
    By hungtt in forum General
    Replies: 1
    Last Post: January 4th, 2011, 12:07 PM
  2. Subscription Problem, I think
    By jcroston in forum Client APIs
    Replies: 4
    Last Post: October 7th, 2008, 04:06 PM
  3. DISTINCT updates are NULL with recent LS server
    By rsouissi in forum Client APIs
    Replies: 4
    Last Post: March 6th, 2008, 01:59 PM
  4. DISTINCT not supported in Stock Demo
    By gmccone in forum Client APIs
    Replies: 1
    Last Post: March 30th, 2007, 10:41 AM
