[ https://issues.apache.org/jira/browse/SPARK-10460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen resolved SPARK-10460.
--------------------------------
    Resolution: Cannot Reproduce

Row definitely has a fieldIndex method as of 1.4.1 (https://github.com/apache/spark/blob/v1.4.1/sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala#L327). I tried this out myself and it worked, so I think there must be some other problem. Maybe you're actually running Spark 1.3.x? In any case, please comment / re-open this issue if you have additional information that can help us reproduce and debug this problem. Thanks!

> fieldIndex method missing on spark.sql.Row
> ------------------------------------------
>
>                 Key: SPARK-10460
>                 URL: https://issues.apache.org/jira/browse/SPARK-10460
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>        Environment: I'm running on an Ubuntu 14.04 32-bit machine, Java 7,
> Spark 1.4.1. The jar was created using sbt-assembly. I've tested both with spark-submit
> and in spark-shell; both times I got the error at the exact same spot.
>            Reporter: FELIPE Q B ALMEIDA
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> {code:title=foo.scala|borderStyle=solid}
> val sc = new SparkContext(cnf)
>
> val sqlContext = new SQLContext(sc)
> import sqlContext.implicits._
>
> // initializing the dataframe from a JSON file
> val reviewsDF = sqlContext.jsonFile(inputDir)
> val schema = reviewsDF.schema
>
> val cleanRDD = reviewsDF.rdd.filter { row: Row =>
>   // ***************************************************************************
>   // error: value fieldIndex is not a member of org.apache.spark.sql.Row
>   val unixTimestampIndex = row.fieldIndex("unixReviewTime")
>   // ***************************************************************************
>   val tryLong = Try(row.getLong(unixTimestampIndex))
>   !row.anyNull && tryLong.isSuccess
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
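For anyone who lands here on a pre-1.4 build: `fieldIndex` simply maps a field name to its ordinal position in the row's schema, so `getLong(i)` can be called safely. A minimal Spark-free sketch of that lookup follows; the `SimpleRow` class, field names, and sample values are hypothetical illustrations of the semantics, not Spark's actual implementation.

```scala
// Sketch of Row.fieldIndex semantics without a Spark dependency.
// A "row" here is just a schema (ordered field names) plus values.
case class SimpleRow(schema: Seq[String], values: Seq[Any]) {
  // Mirrors the idea of Row#fieldIndex(name): return the ordinal of
  // the named field, failing loudly if the field is absent.
  def fieldIndex(name: String): Int = {
    val i = schema.indexOf(name)
    require(i >= 0, s"Field '$name' not found in schema $schema")
    i
  }

  // Positional accessor, analogous to Row#getLong(i).
  def getLong(i: Int): Long = values(i).asInstanceOf[Long]

  // True if any field is null, analogous to Row#anyNull.
  def anyNull: Boolean = values.exists(_ == null)
}

// Hypothetical sample data shaped like a review record.
val row = SimpleRow(
  Seq("reviewerID", "unixReviewTime"),
  Seq("A2SUAM1J3GNN3B", 1252800000L))

val ts = row.getLong(row.fieldIndex("unixReviewTime"))
```

On a real 1.4.1+ `org.apache.spark.sql.Row` the equivalent call is `row.getLong(row.fieldIndex("unixReviewTime"))`; if that fails to compile, checking `sc.version` at runtime is a quick way to confirm which Spark build is actually on the classpath.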