After the project was deployed on Tomcat, BI data updates are either very slow or hang completely. Has anyone run into this?

Here is part of the log:

13:43:37 Executor task launch worker for task 12 ERROR [standard]  empty String
java.lang.NumberFormatException: empty String
        at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1842)
        at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
        at java.lang.Double.parseDouble(Double.java:538)
        at com.fr.function.TODOUBLE.run(TODOUBLE.java:31)
        at com.finebi.jep.function.custom.text.TODOUBLE.run(TODOUBLE.java:19)
        at com.finebi.jep.function.AbstractFunction.run(AbstractFunction.java:49)
        at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:273)
        at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
        at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
        at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
        at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
        at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
        at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
        at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
        at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
        at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
        at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
        at org.nfunk.jep.EvaluatorVisitor.getValue(EvaluatorVisitor.java:110)
        at org.nfunk.jep.JEP.evaluate(JEP.java:635)
        at com.finebi.jep.Jep.evaluateCheck(Jep.java:76)
        at com.finebi.jep.Jep.evaluate(Jep.java:64)
        at com.finebi.spider.sparksql.udf.JepFormulaUdfETL.call(JepFormulaUdfETL.java:87)
        at com.finebi.spider.sparksql.udf.JepFormulaUdfETL.call(JepFormulaUdfETL.java:21)
        at org.apache.spark.sql.functions$$anonfun$21.apply(functions.scala:3616)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage13.serializefromobject_doConsume$(Unknown Source)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage13.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:30)
        at com.finebi.spider.etl.job.spark.analysisfunction.ExtendRowIterator.hasNext(ExtendRowIterator.java:57)
        at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage14.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
        at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2.scala:130)
        at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2.scala:129)
        at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1411)
        at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2.scala:135)
        at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$2.apply(WriteToDataSourceV2.scala:79)
        at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$2.apply(WriteToDataSourceV2.scala:78)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
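Reading the trace top-down, the exception originates in `com.fr.function.TODOUBLE`, which calls `Double.parseDouble` on a value that turns out to be an empty string, so a `TODOUBLE(...)` formula in the update appears to be hitting rows where the source field is blank. Since the exception is thrown and logged per row, a large table with many blank cells could plausibly make the update crawl. A minimal sketch of the failure mode and a defensive guard (the `toDoubleSafe` helper and its `0.0` fallback are my assumptions, not FineBI code):

```java
public class ToDoubleDemo {
    // What happens inside TODOUBLE when the cell is blank:
    // Double.parseDouble("") throws NumberFormatException("empty String").
    static double toDoubleUnsafe(String s) {
        return Double.parseDouble(s);
    }

    // Hypothetical guarded variant: map null/blank input to a sentinel (0.0 here;
    // pick whatever default makes sense for your data) before parsing.
    static double toDoubleSafe(String s) {
        if (s == null || s.trim().isEmpty()) {
            return 0.0;
        }
        return Double.parseDouble(s.trim());
    }

    public static void main(String[] args) {
        System.out.println(toDoubleSafe("3.14")); // parses normally
        System.out.println(toDoubleSafe(""));     // falls back to the sentinel
        try {
            toDoubleUnsafe("");
        } catch (NumberFormatException e) {
            // Same message as in the log above
            System.out.println(e.getMessage());
        }
    }
}
```

The equivalent fix inside BI would be to filter out or default the blank values before applying the conversion formula, rather than changing any code.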

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Database session opened

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Found user by condition QueryConditionImpl{restriction=Restriction{type=AND, restrictions=[Restriction{type=EQ, column=userName, value=luojian0323}]}, skip=0, count=0, sort=[]}

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Database session closed

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Query action using 6 ms

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Database session opened

13:43:37 http-nio-8080-exec-3 DEBUG [standard]  Found user by condition QueryConditionImpl{restriction=Restriction{type=AND, restrictions=[Restriction{type=EQ, column=userName, value=luojian0323}]}, skip=0, count=0, sort=[]}

13:43:37 Executor task launch worker for task 12 ERROR [standard]  empty String

java.lang.NumberFormatException: empty String

        (the same stack trace as above repeats)


FineBI luojian0323 Posted on 2020-1-1 13:44 (edited 2020-1-2 09:49)
3 answers
Best answer
3
陈星 Lv6 Junior Helper
Posted on 2020-1-2 10:10 (edited 2020-1-2 10:11)

Not sure whether this meets your requirement:

image.png

  • luojian0323 (OP): That's not it. I already have that option checked.
    2020-01-02 10:14
  • 陈星 replying to luojian0323 (OP): Uncheck it.
    2020-01-02 10:18
Best answer
1
snrtuemc Lv8 Expert Helper
Posted on 2020-6-8 19:02

A field that was originally supposed to receive numeric data is being assigned string data, which causes the conversion error.

See https://www.cnblogs.com/hualidezhuanshen/archive/2013/06/12/3132745.html

Make sure the input data types are correct.
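The stack trace confirms this: the failure is `Double.parseDouble` receiving an empty string inside the `TODOUBLE` formula function. As a minimal sketch only (a hypothetical helper, not FineBI's actual code), a defensive conversion that substitutes a fallback value for empty or non-numeric input looks like:

```java
public class SafeToDouble {

    // Returns fallback when the input is null, blank, or not a valid number,
    // i.e. exactly the inputs Double.parseDouble rejects with NumberFormatException.
    static double toDouble(String s, double fallback) {
        if (s == null || s.trim().isEmpty()) {
            return fallback;
        }
        try {
            return Double.parseDouble(s.trim());
        } catch (NumberFormatException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        System.out.println(toDouble("", 0.0));      // 0.0: empty string no longer throws
        System.out.println(toDouble("3.14", 0.0));  // 3.14: normal numeric input
        System.out.println(toDouble("abc", 0.0));   // 0.0: non-numeric input falls back
    }
}
```

In BI terms, the equivalent fix is to clean the source column (filter or replace empty values) before applying a numeric conversion to it.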

Best answer
0
小歆嵩 Lv7 Junior Helper
Posted on 2020-1-2 10:00

Looks like a memory issue. Try restarting Tomcat!

  • 4 followers
  • 583 views
  • Last answered: 2020-6-8 19:02