
Filter with large key set

Hi there,

I have the following use case: a key set, say [A, B, C, ...], with around 10M entries; the type of the entries can be one of the types in BasicTypeInfo, e.g. String, Long, Integer, etc.

and each message looks like the following: message: { header: A, body: {} }

I would like to use Flink to filter on each message's header field, checking whether its value is present in the key set.

The key set needs to be dynamic, meaning that at any time we can perform add/read/delete operations on it.

Any suggestions are very welcome! (From the volunteer-curated Flink mailing list archive)

彗星halation 2021-12-02 17:52:15
1 reply
  • Hi Eleanore,

    A dynamic filter like the one you need is essentially a join operation. There are two ways to do this:

    • partitioning the key set and the messages on the join attribute. This would be done with a KeyedCoProcessFunction.
    • broadcasting the key set and just locally forwarding the messages. This would be done with a KeyedBroadcastProcessFunction (see the sketch after this list).
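
    Below is a minimal sketch of the broadcast variant. It uses the non-keyed BroadcastProcessFunction for simplicity, since a pure filter does not need keyed state on the message side; the Message and KeySetUpdate classes, the filter() method, and the String-typed header are illustrative assumptions, not part of the original answer.

```java
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.util.Collector;

public class KeySetFilterJob {

    // Broadcast state holding the key set; the Boolean value only marks presence.
    static final MapStateDescriptor<String, Boolean> KEY_SET_DESCRIPTOR =
            new MapStateDescriptor<>("keySet",
                    BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.BOOLEAN_TYPE_INFO);

    public static DataStream<Message> filter(DataStream<Message> messages,
                                             DataStream<KeySetUpdate> keySetUpdates) {
        BroadcastStream<KeySetUpdate> broadcastUpdates =
                keySetUpdates.broadcast(KEY_SET_DESCRIPTOR);

        return messages
                .connect(broadcastUpdates)
                .process(new BroadcastProcessFunction<Message, KeySetUpdate, Message>() {
                    @Override
                    public void processElement(Message msg, ReadOnlyContext ctx,
                                               Collector<Message> out) throws Exception {
                        // Forward the message only if its header is currently in the key set.
                        if (ctx.getBroadcastState(KEY_SET_DESCRIPTOR).contains(msg.header)) {
                            out.collect(msg);
                        }
                    }

                    @Override
                    public void processBroadcastElement(KeySetUpdate update, Context ctx,
                                                        Collector<Message> out) throws Exception {
                        // Apply add/delete operations to the broadcast key set.
                        if (update.isDelete) {
                            ctx.getBroadcastState(KEY_SET_DESCRIPTOR).remove(update.key);
                        } else {
                            ctx.getBroadcastState(KEY_SET_DESCRIPTOR).put(update.key, true);
                        }
                    }
                });
    }

    // Minimal placeholder POJOs, assumed for this sketch.
    public static class Message { public String header; /* body omitted */ }
    public static class KeySetUpdate { public String key; public boolean isDelete; }
}
```

    Because the key-set stream carries add/delete operations, the filter reacts to changes at runtime without restarting the job. Note that broadcast state is replicated on every parallel instance, so with 10M entries the partitioned KeyedCoProcessFunction variant may be preferable if the set does not fit comfortably in memory.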

    The challenge in your application is that the key set entries have different types, which is something that Flink does not support very well. There are two ways to go about this:

    1) Route all data through the same operators that can handle all types. You can model this with an n-ary Either type. Flink only has a binary Either type, so you would need to implement the TypeInformation, serializer, and comparator yourself. The Either classes should give you good guidance for that.

    2) Have different operators and flows for each basic data type. This will fan out your job, but should be the easier approach (see the routing sketch below).

    (From the volunteer-curated Flink mailing list archive)
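
    For option 2), a possible first step is routing messages into per-type flows with side outputs; a hypothetical sketch follows (the RawMessage type, TypeRouter class, and route() method are illustrative names only, and just two basic types are shown).

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class TypeRouter {

    // Hypothetical message type whose header can be any basic type.
    public static class RawMessage { public Object header; /* body omitted */ }

    // Side output for Long-typed headers; String headers stay on the main output.
    static final OutputTag<RawMessage> LONG_HEADERS = new OutputTag<RawMessage>("long-headers") {};

    public static SingleOutputStreamOperator<RawMessage> route(DataStream<RawMessage> messages) {
        return messages.process(new ProcessFunction<RawMessage, RawMessage>() {
            @Override
            public void processElement(RawMessage msg, Context ctx, Collector<RawMessage> out) {
                if (msg.header instanceof Long) {
                    ctx.output(LONG_HEADERS, msg);  // Long-typed headers go to the side output
                } else {
                    out.collect(msg);               // String-typed headers stay on the main output
                }
            }
        });
    }
}
```

    Each routed stream (the main output and routed.getSideOutput(TypeRouter.LONG_HEADERS)) can then be joined with the key-set stream of the matching type, for example using the broadcast pattern sketched above.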

    2021-12-02 18:11:50