Spark DataFrame: iterate rows in Scala

You can convert a Row to a Seq with toSeq. Once converted to a Seq, you can iterate over it as usual with foreach, map, or whatever you need. Each row of the DataFrame's underlying RDD has type Row.

If you have a small dataset, you can also convert the PySpark DataFrame to pandas and use pandas to iterate through it. Set the spark.sql.execution.arrow.enabled config (spark.sql.execution.arrow.pyspark.enabled in Spark 3.x) to enable Apache Arrow, an in-memory columnar format that Spark uses to transfer data between Python and the JVM.
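A minimal Scala sketch of the toSeq approach; the column names and the local SparkSession setup are assumptions for illustration, not part of the original answer:

```scala
import org.apache.spark.sql.{Row, SparkSession}

object IterateRows {
  def main(args: Array[String]): Unit = {
    // Hypothetical local session for this sketch
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("iterate-rows")
      .getOrCreate()
    import spark.implicits._

    // Sample DataFrame; the columns are made up for the example
    val df = Seq(("alice", 30), ("bob", 25)).toDF("name", "age")

    // collect() pulls every row to the driver, so this is for small data only
    df.collect().foreach { row: Row =>
      // Row.toSeq yields a Seq[Any] you can iterate like any Scala collection
      row.toSeq.foreach(value => print(s"$value "))
      println()
    }

    spark.stop()
  }
}
```

For large DataFrames, prefer df.foreach or df.map on the executors rather than collect(), which can exhaust driver memory.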
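For the small-dataset path, here is a hedged sketch of iterating after a pandas conversion. The toPandas() and Arrow-config lines are shown as comments because they need a running SparkSession; the sketch stands in a small pandas DataFrame directly so it is self-contained, and the column names are assumptions:

```python
import pandas as pd

# With a real PySpark DataFrame `df` you would first enable Arrow and convert:
#   spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
#   pdf = df.toPandas()
# Here we build the pandas DataFrame directly for a self-contained example.
pdf = pd.DataFrame({"name": ["alice", "bob"], "age": [30, 25]})

# itertuples() is the idiomatic way to iterate pandas rows
for row in pdf.itertuples(index=False):
    print(row.name, row.age)
```

Arrow only speeds up the JVM-to-Python transfer inside toPandas(); the result must still fit in driver memory, which is why this route is recommended for small datasets only.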