How to split and merge data (vectors) in Apache Flink without using windows

2exbekwf  posted on 2021-06-21  in Flink

I need to split a cube of integers into vectors, perform some operation on each vector (say a simple addition), and then merge the vectors back into a cube. The vector operations should be executed in parallel (i.e. one vector per stream). A cube is an object that carries an id.
I can split the cube into vectors, build a tuple that carries the cube's id, and then use keyBy(id) so that the vectors of each cube end up in one partition. However, it seems that I would then have to use a window over some unit of time to collect them again. The application is very latency sensitive, so I would rather combine the vectors as they arrive, perhaps with some kind of logical clock (I know how many vectors a cube contains), and send the reassembled cube downstream as soon as its last vector arrives. Is this possible in Flink?
Here is a code snippet that illustrates the idea:

//Stream topology..
final StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();

DataStream<Cube> stream = env
    //Take cubes from collection and send downstream
    .fromCollection(cubes)
    //Split the cube(int[][][]) to vectors(int[]) and send downstream
    .flatMap(new VSplitter()) //returns tuple with id at pos 1
    .keyBy(1)
    //For each value in each vector element, add its value with one.
    .map(new MapFunction<Tuple2<CubeVector, Integer>, Tuple2<CubeVector, Integer>>() {
        @Override
        public Tuple2<CubeVector, Integer> map(Tuple2<CubeVector, Integer> cVec) throws Exception {
            CubeVector cv = cVec.getField(0);
            cv.cubeVectorAdd(1);
            cVec.setField(cv, 0);
            return cVec;
        }
    })

    //**Merge vectors back to a cube**//

    .
    .
    .

//The cube splitter to vectors..
public static class VSplitter implements FlatMapFunction<Cube, Tuple2<CubeVector, Integer>> {
    @Override
    public void flatMap(Cube cube, Collector<Tuple2<CubeVector, Integer>> out) throws Exception {
        for (CubeVector cv : cubeVSplit(cube)) {
            //out.assignTimestamp()
            out.collect(new Tuple2<CubeVector, Integer>(cv, cube.getId()));
        }
    }
}
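
For reference, the Cube and CubeVector types and the cubeVSplit helper used above are not shown. A minimal sketch of what they might look like, assuming the cube stores its data as int[][][] and each vector remembers its (x, y) position so the cube can be rebuilt later (everything except getId, cubeVectorAdd and cubeVSplit is an assumption):

import java.util.ArrayList;
import java.util.List;

//Hypothetical cube: an id plus the raw int[][][] data.
public static class Cube {
    private final int id;
    private final int[][][] data;

    public Cube(int id, int[][][] data) {
        this.id = id;
        this.data = data;
    }

    public int getId() { return id; }
    public int[][][] getData() { return data; }
}

//Hypothetical vector: one innermost int[] of the cube, tagged with its position.
public static class CubeVector {
    public int cubeId;
    public int x;        //first index within the cube (assumed layout)
    public int y;        //second index within the cube
    public int[] values; //the innermost int[] slice

    public CubeVector() {}

    public CubeVector(int cubeId, int x, int y, int[] values) {
        this.cubeId = cubeId;
        this.x = x;
        this.y = y;
        this.values = values;
    }

    //Add a constant to every element of the vector.
    public void cubeVectorAdd(int delta) {
        for (int i = 0; i < values.length; i++) {
            values[i] += delta;
        }
    }
}

//Split a cube (int[][][]) into its vectors (int[]), keeping the coordinates.
public static List<CubeVector> cubeVSplit(Cube cube) {
    List<CubeVector> vectors = new ArrayList<>();
    int[][][] data = cube.getData();
    for (int x = 0; x < data.length; x++) {
        for (int y = 0; y < data[x].length; y++) {
            vectors.add(new CubeVector(cube.getId(), x, y, data[x][y]));
        }
    }
    return vectors;
}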

sc4hvdpw 1#

You need a FlatMapFunction that keeps collecting CubeVectors until it has seen enough CubeVectors to reconstruct a Cube. The following snippet should do the trick:

DataStream<Tuple2<CubeVector, Integer>> input = ...

input.keyBy(1).flatMap(
    new RichFlatMapFunction<Tuple2<CubeVector, Integer>, Cube>() {

        private static final ListStateDescriptor<CubeVector> cubeVectorsStateDescriptor = new ListStateDescriptor<CubeVector>(
                "cubeVectors",
                new CubeVectorTypeInformation());

        private static final ValueStateDescriptor<Integer> cubeVectorCounterDescriptor = new ValueStateDescriptor<>(
                "cubeVectorCounter",
                BasicTypeInfo.INT_TYPE_INFO);

        private ListState<CubeVector> cubeVectors;

        private ValueState<Integer> cubeVectorCounter;

        @Override
        public void open(Configuration parameters) {
            cubeVectors = getRuntimeContext().getListState(cubeVectorsStateDescriptor);
            cubeVectorCounter = getRuntimeContext().getState(cubeVectorCounterDescriptor);
        }

        @Override
        public void flatMap(Tuple2<CubeVector, Integer> cubeVectorIntegerTuple2, Collector<Cube> collector) throws Exception {
            cubeVectors.add(cubeVectorIntegerTuple2.f0);
            final Integer oldCounterValue = cubeVectorCounter.value();

            // ValueState#value() returns null the first time a key is seen.
            final int newCounterValue = (oldCounterValue == null ? 0 : oldCounterValue) + 1;

            if (newCounterValue == NUMBER_CUBE_VECTORS) {
                Cube cube = createCube(cubeVectors.get());

                cubeVectors.clear();
                cubeVectorCounter.update(0);

                collector.collect(cube);
            } else {
                cubeVectorCounter.update(newCounterValue);
            }
        }
    });
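
Note that createCube and NUMBER_CUBE_VECTORS are not Flink API: NUMBER_CUBE_VECTORS has to equal the number of vectors that cubeVSplit produces per cube, and createCube has to rebuild the int[][][] from the buffered vectors. Assuming each CubeVector carries its (x, y) position within the cube, the helper could look roughly like this (DIM_X and DIM_Y are assumed constants for the known cube dimensions):

//Hypothetical helper: rebuild the cube from the vectors buffered in ListState.
private static Cube createCube(Iterable<CubeVector> vectors) {
    int[][][] data = new int[DIM_X][DIM_Y][];
    int cubeId = -1;
    for (CubeVector cv : vectors) {
        cubeId = cv.cubeId;
        data[cv.x][cv.y] = cv.values;
    }
    return new Cube(cubeId, data);
}

Because the stream is keyed by the cube id, ListState#get() only returns the vectors of the cube currently being processed, so no windowing is needed: the cube is emitted as soon as its last vector arrives, which keeps the latency at a minimum.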
