0
votes

When I run a Flink Scala REPL script from Java, the script does not compile.

I tried the Java code below to run the Flink Scala REPL as a test, but it always fails with an error.

    Settings settings = new Settings();
    ((MutableSettings.BooleanSetting) settings.usejavacp()).value_$eq(true);
    IMain main = new IMain(settings, new PrintWriter(System.out));
    // Thread.currentThread().setContextClassLoader(main.classLoader());
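    // "imports" holds the Flink packages that are imported (echoed in the interpreter output below)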
    for (String imp : imports) {
        main.interpret(MessageFormat.format("import {0}", imp));
    }
    ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment();
    String script = FileUtils.readFileToString(new File("/opt/project/security-detection/sappo/src/sappo-interpreter/src/test/resources/demo.txt"), StandardCharsets.UTF_8);
    main.bind(new NamedParamClass("env", ExecutionEnvironment.class.getName(), env));
    main.interpret(script); 

Scala script (demo.txt):

    val text = env.fromElements("Who's there?", "I think I hear them. Stand, ho! Who's there?")

    // result 1
    val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)
    counts.print()

    // result 2
    val counts = text.map((x:String) => 1)
    counts.print()

    // result 3
    text.print()

result 1

    import org.apache.flink.core.fs._
    import org.apache.flink.core.fs.local._
    import org.apache.flink.api.common.io._
    import org.apache.flink.api.common.aggregators._
    import org.apache.flink.api.common.accumulators._
    import org.apache.flink.api.common.distributions._
    import org.apache.flink.api.common.operators._
    import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint
    import org.apache.flink.api.common.functions._
    import org.apache.flink.api.java.io._
    import org.apache.flink.api.java.aggregation._
    import org.apache.flink.api.java.functions._
    import org.apache.flink.api.java.operators._
    import org.apache.flink.api.java.sampling._
    import org.apache.flink.api.scala._
    import org.apache.flink.api.scala.utils._
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.api.windowing.time._
    env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
    console:67: error: missing parameter type for expanded function ((x$1) => x$1.toLowerCase.split("\\W+").filter(((x$2) => x$2.nonEmpty)))
           val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)

result 2

    import org.apache.flink.core.fs._
    import org.apache.flink.core.fs.local._
    import org.apache.flink.api.common.io._
    import org.apache.flink.api.common.aggregators._
    import org.apache.flink.api.common.accumulators._
    import org.apache.flink.api.common.distributions._
    import org.apache.flink.api.common.operators._
    import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint
    import org.apache.flink.api.common.functions._
    import org.apache.flink.api.java.io._
    import org.apache.flink.api.java.aggregation._
    import org.apache.flink.api.java.functions._
    import org.apache.flink.api.java.operators._
    import org.apache.flink.api.java.sampling._
    import org.apache.flink.api.scala._
    import org.apache.flink.api.scala.utils._
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.api.windowing.time._
    env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : 5cbf8e476ebf32fd8fdf91766bd40af0
    console:71: error: type mismatch;
     found   : String => Int
     required: org.apache.flink.api.common.functions.MapFunction[String,?]
           val counts = text.map((x:String) => 1)

result 3

    import org.apache.flink.core.fs._
    import org.apache.flink.core.fs.local._
    import org.apache.flink.api.common.io._
    import org.apache.flink.api.common.aggregators._
    import org.apache.flink.api.common.accumulators._
    import org.apache.flink.api.common.distributions._
    import org.apache.flink.api.common.operators._
    import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint
    import org.apache.flink.api.common.functions._
    import org.apache.flink.api.java.io._
    import org.apache.flink.api.java.aggregation._
    import org.apache.flink.api.java.functions._
    import org.apache.flink.api.java.operators._
    import org.apache.flink.api.java.sampling._
    import org.apache.flink.api.scala._
    import org.apache.flink.api.scala.utils._
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.api.windowing.time._
    env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
    Who's there?
    I think I hear them. Stand, ho! Who's there?
    text: org.apache.flink.api.java.operators.DataSource[String] = org.apache.flink.api.java.operators.DataSource@53e28097
    PASSED: testIMain
    PASSED: testIMainScript

I'm not an expert in Flink... but it looks like you are trying to execute Scala code in a Java interpreter - that would explain the error message. BTW, my main clue for that is the "org.apache.flink.api.java.ExecutionEnvironment" in the output. – Luis Miguel Mejía Suárez

@LuisMiguelMejíaSuárez Yes, this is an import error ~ thanks. – Liu Zheng
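Following that diagnosis, here is a minimal sketch of a corrected demo.txt (my assumption of the fix, not the asker's exact change): it drives the Scala API's ExecutionEnvironment instead of the bound org.apache.flink.api.java one, so the Scala DataSet implicits apply to the lambdas.

    // Assumed fix: use the Scala API's environment rather than the bound Java env.
    import org.apache.flink.api.scala._   // Scala ExecutionEnvironment + TypeInformation implicits

    val scalaEnv = ExecutionEnvironment.createLocalEnvironment()
    val text = scalaEnv.fromElements("Who's there?", "I think I hear them. Stand, ho! Who's there?")

    val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)
    counts.print()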

1 Answer

0
votes

Try using the Scala REPL that comes with Flink:

$ bin/start-scala-shell.sh local

I tried the three examples you shared (with Flink 1.7.0), and they all worked just fine.
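For reference, that shell pre-binds a batch ExecutionEnvironment as benv (and a streaming one as senv), so the question's snippets run there with env replaced by benv; for example:

    // Inside the shell started with bin/start-scala-shell.sh local;
    // the Flink Scala API imports are already in scope.
    val text = benv.fromElements("Who's there?", "I think I hear them. Stand, ho! Who's there?")
    text.print()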