All Questions
Tagged with apache-flink flink-table-api
84 questions
2 votes · 0 answers · 35 views
Flink SQL issue with UNNEST - output is changed by a function
I am trying to work with the api.countrylayer.com payload, which returns a JSON list of countries. The JSON batch list should be split by country for further processing. Therefore I am using the ...
1 vote · 1 answer · 28 views
How do sqlExecute queries run in Apache Flink when triggered via a processFunction? How are the SQL tasks managed?
Context:
So, I am trying to build a Flink application that runs rules dynamically. I have a rule stream to which SQL rules are written; Flink reads from it and executes them. I have connected the ...
0 votes · 0 answers · 17 views
Flink Table API left outer join is fetching records with null for matching records on the right side
Table result = s.leftOuterJoin(t, $("key").isEqual($("t_key")))
    .where($("t_key").isNull())
    .select($("key"), $("value"), $("date"));
Table s = tableEnv.from("source");
Table t = ...
0 votes · 0 answers · 37 views
Need to restart Flink Batch job on job restart
I have a Flink streaming application. For validation, I need to reference a few tables in my Postgres database. For this I use the Table API within the DataStream application. Everything works fine on the ...
1 vote · 2 answers · 111 views
How to fetch a text column of PostgreSQL from the Flink Table API
I am getting the following exception when trying to fetch a text column from a table in PostgreSQL. How can I resolve it?
Table schema:
Snapshot of data:
Table resultTable = tenv.sqlQuery(
...
0 votes · 0 answers · 71 views
Flink Table API state management
I have a few big MySQL tables, which I'd like to join with each other to produce a result. I also want to receive infrequent updates for records that were created during the previous year.
Current ...
0 votes · 0 answers · 71 views
TableAggregate user-defined function in Flink: the Python emit function is called a different number of times than the Java UDF's
I am testing the TableAggregateFunction in Flink as a Java UDF and as a Python UDF. The Java UDF's emit function is called after intermediate aggregation values, whereas for the Python UDF the emit function call ...
0 votes · 0 answers · 28 views
How to manage delay when loading from a Kafka topic into a table using the Table API in PyFlink
I am loading data from two Kafka topics into tables using the Table API in PyFlink and am going to perform a join operation. What will happen if the data from one of the Kafka topics is loaded into the corresponding ...
1 vote · 2 answers · 311 views
Cannot upgrade Flink SQL job to 1.18 because the Calc and ChangelogNormalize order changed
Context
We have Flink jobs running in a Flink cluster using version 1.15.2. Those jobs are composed of:
One or several KafkaSource, from which we create a changelog stream with a primary key.
An SQL ...
4 votes · 0 answers · 221 views
How to read Parquet files from S3 Ceph in Java using Flink 1.16.0 Table API?
I am trying to read Parquet files from S3 Ceph in Java using Flink 1.16.0 and the Table API FileSystem connector, but I am encountering the following error:
java.lang.NoClassDefFoundError: org/apache/hadoop/...
0 votes · 1 answer · 416 views
Flink Table API program does not compile when assigning watermark using a field converted with UDF
Since the TO_TIMESTAMP(value, format) function in the Flink Table API does not support custom formats like yyyyMMddHHmmssSSS, we needed to create a UDF (user-defined function) for the custom conversion.
However, ...
0 votes · 1 answer · 760 views
Job client must be a CoordinationRequestGateway. This is a bug
We are using the Flink Table API, and we run into this issue while performing an executeQuery operation.
Basically, everything works completely fine locally, but when we run the same application on ...
1 vote · 0 answers · 118 views
await method on TableResult is not working when job is submitted via Session Mode using Apache Flink Operator
The await method on TableResult does not work when the job is submitted in session mode via the Apache Flink Operator by creating a FlinkSessionJob resource in Kubernetes. The same code works when the job ...
-1 votes · 1 answer · 532 views
How can I map each field of an event to a separate column when creating a table using Flink Table API from a data-stream of Confluent Avro?
How to create a table using the Table API in Flink for modelling
I am working on creating a table using the Table API from a DataStream of Confluent Avro records (from a Kafka topic). I am trying the below code (...
1 vote · 0 answers · 199 views
Create separate Table for each key in Data Stream after doing keyBy operation in Flink
My requirement is to create separate tables for each key in two different data streams and then join them. I have successfully created two separate tables from the data streams in Flink and performed ...