luojian0323 (uid:169914)
FanRuan community Q&A consultant; listed in the 2022 FanRuan product acknowledgements; professional certifications: FCP-FineBI | FCA-FineBI | FCP-Report Development Engineer | FCP-Business Analyst
  • 1.mp4 As in the title: can FineReport build a fill-in report like this? I only need the red area to stay fixed at the bottom of the screen, while the blue area allows inserting new rows and scrolls when it overflows the screen.
  • contentPane.toolbar.options.items.fireEvent('click'); does not work. contentPane.toolbar.getWidgetByName("StashButton").fireEvent('click'); does not work either (on the desktop side it reports success but has no effect, nothing is actually stashed; on mobile it throws an error, see the screenshot below). contentPane.toolbar.getWidgetByName("Stash").fireEvent('click'); also throws an error. (A small debugging sketch follows below.)
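    A minimal debugging sketch, assuming the standard FineReport page object contentPane and the toolbar structure already referenced in the question (contentPane.toolbar.options.items, fireEvent); the widget names "Stash"/"StashButton" come from the question itself, and none of the property names below are verified against a specific FineReport version:

        // List the toolbar widgets that actually exist, then fire a click on the
        // one whose name looks like the stash button. The property names mirror
        // the calls in the question and are assumptions, not a documented API.
        var items = (contentPane.toolbar.options && contentPane.toolbar.options.items) || [];
        for (var i = 0; i < items.length; i++) {
            var name = items[i].options ? items[i].options.name : undefined;
            console.log('toolbar item', i, name);        // inspect the real widget names first
            if (name && /stash/i.test(String(name))) {
                items[i].fireEvent('click');             // then trigger the matching button
            }
        }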
  • In a fill-in report, hovering the mouse over an image to enlarge it works fine, but the same code does not work in a decision report. Could someone point me in the right direction? Below is the code used in the fill-in report (a delegated variant for the decision report is sketched after it):

        $("img").mouseover(function (e) {
            var img = $(this).attr("src");   // src of the image currently under the mouse
            // pop up a div containing the enlarged image
            $("body").append("<div id='preview'><img src='" + img + "' /></div>");
            // style the div so it is centered and on top
            $("#preview")
                .css("-ms-transform", "translate(-50%,-50%)")
                .css("-moz-transform", "translate(-50%,-50%)")
                .css("-o-transform", "translate(-50%,-50%)")
                .css("transform", "translate(-50%,-50%)")
                .css("left", "40%")
                .css("top", "10%")
                .css("z-index", "9999999")
                .css("position", "absolute")
                .css("transform", "scale(1)");
        });
        // remove the div when the mouse leaves the image
        $("img").mouseout(function (e) {
            $("#preview").remove();
        });
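    One common reason the same code fails in a decision report is that its components render (and re-render) after page load, so a one-time $("img").mouseover(...) binding misses the images. A minimal sketch of the usual workaround, assuming the images live in the same document; if a component is rendered inside an iframe, the handlers would have to be attached inside that iframe's document as well, which is not covered here:

        // Delegate the hover handlers so images rendered later are still covered.
        $(document).on("mouseover", "img", function () {
            var src = $(this).attr("src");
            $("body").append("<div id='preview'><img src='" + src + "' /></div>");
            $("#preview").css({
                position: "absolute",
                left: "40%",
                top: "10%",
                "z-index": 9999999,
                transform: "translate(-50%,-50%)"
            });
        });
        $(document).on("mouseout", "img", function () {
            $("#preview").remove();
        });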
  • Counting the number of customers throws an error. Another machine does not show this error; only this one does. This machine uses the standalone deployment, the other is deployed on Tomcat. get sub dataModel error, java.lang.ClassCastException: com.finebi.spider.common.struct.columnstream.sequence.memory.MemorySeqStream cannot be cast to com.finebi.spider.common.struct.columnstream.sequence.memory.MemoryMultiStream
  • I want to attach a keyboard listener to a report opened in a dialog, but it only takes effect after I click the report page once; otherwise it never fires. Is there a way to reproduce that click automatically as soon as the report opens, so the keyboard listener works right away? (See the sketch below.) I hope this description is clear enough.
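    A minimal sketch of the usual approach, assuming the dialog loads the report in a same-origin iframe and the keydown handler is bound inside that report: move keyboard focus into the report (and optionally dispatch a synthetic click) as soon as it finishes loading. The bare "iframe" selector is a placeholder; a real page would target the dialog's own iframe.

        // Once the dialog's iframe has loaded the report, focus it and simulate
        // the first click so keydown events are delivered without user action.
        $("iframe").on("load", function () {
            var win = this.contentWindow;
            win.focus();                    // give the report window keyboard focus
            win.document.body.click();      // optional: mimic the click the user would make
        });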
  • As shown in the picture below: suppose the database contains records like these, and they need to be deleted with order_code as the condition. The decision report shows summary data grouped by order_code; how can records be deleted in bulk by order_code?
  • After the project was deployed on Tomcat, updating data in BI is either very slow or simply hangs. Has anyone run into this? Part of the log is below. The identical stack trace is printed over and over for task 12; only one occurrence is kept here, and between the repetitions the log only shows routine DEBUG lines from http-nio-8080-exec-3 ("Database session opened", "Found user by condition QueryConditionImpl{restriction=Restriction{type=AND, restrictions=}, skip=0, count=0, sort=}", "Query action using 6 ms", "Database session closed"). A sketch of the empty-value guard the trace points at follows after the log.

        13:43:37 Executor task launch worker for task 12 ERROR   empty String
        java.lang.NumberFormatException: empty String
                at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1842)
                at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
                at java.lang.Double.parseDouble(Double.java:538)
                at com.fr.function.TODOUBLE.run(TODOUBLE.java:31)
                at com.finebi.jep.function.custom.text.TODOUBLE.run(TODOUBLE.java:19)
                at com.finebi.jep.function.AbstractFunction.run(AbstractFunction.java:49)
                at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:273)
                at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
                at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
                at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
                at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
                at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
                at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
                at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
                at org.nfunk.jep.SimpleNode.childrenAccept(SimpleNode.java:77)
                at org.nfunk.jep.EvaluatorVisitor.visit(EvaluatorVisitor.java:258)
                at org.nfunk.jep.ASTFunNode.jjtAccept(ASTFunNode.java:53)
                at org.nfunk.jep.EvaluatorVisitor.getValue(EvaluatorVisitor.java:110)
                at org.nfunk.jep.JEP.evaluate(JEP.java:635)
                at com.finebi.jep.Jep.evaluateCheck(Jep.java:76)
                at com.finebi.jep.Jep.evaluate(Jep.java:64)
                at com.finebi.spider.sparksql.udf.JepFormulaUdfETL.call(JepFormulaUdfETL.java:87)
                at com.finebi.spider.sparksql.udf.JepFormulaUdfETL.call(JepFormulaUdfETL.java:21)
                at org.apache.spark.sql.functions$$anonfun$21.apply(functions.scala:3616)
                at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage13.serializefromobject_doConsume$(Unknown Source)
                at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage13.processNext(Unknown Source)
                at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
                at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
                at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
                at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
                at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:30)
                at com.finebi.spider.etl.job.spark.analysisfunction.ExtendRowIterator.hasNext(ExtendRowIterator.java:57)
                at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
                at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
                at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage14.processNext(Unknown Source)
                at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
                at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
                at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
                at scala.collection.Iterator$class.foreach(Iterator.scala:893)
                at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
                at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2.scala:130)
                at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2.scala:129)
                at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1411)
                at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2.scala:135)
                at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$2.apply(WriteToDataSourceV2.scala:79)
                at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$2.apply(WriteToDataSourceV2.scala:78)
                at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
                at org.apache.spark.scheduler.Task.run(Task.scala:109)
                at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
                at java.lang.Thread.run(Thread.java:748)
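    The trace itself narrows the problem down: TODOUBLE inside a self-service dataset formula is being handed an empty string, and the exception is thrown and logged row after row during the Spark write, which may well be what makes the update crawl or appear stuck. A minimal sketch of the guard idea, written in JavaScript purely for illustration; the real fix would be an equivalent empty-value check wrapped around the TODOUBLE call in the FineBI formula, or cleaning the source column:

        // Illustration only: never hand an empty value to the numeric conversion.
        function safeToDouble(value) {
            if (value === null || value === undefined || String(value).trim() === "") {
                return 0;                  // the fallback value is a choice, not a requirement
            }
            return parseFloat(value);      // what TODOUBLE effectively does on non-empty input
        }

        console.log(safeToDouble(""));     // 0 instead of an "empty String" failure
        console.log(safeToDouble("3.14")); // 3.14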
  • Is there a formula that can do this?
  • Could someone analyze what might be causing this error message? It comes up every time; refreshing the page makes it go away, but after a while the same error appears again.

Personal achievements
Content viewed 3,141,428 times
Community member for 6 years and 145 days