Tag: apache-spark-sql

PySpark: Subquery in a CASE statement

I am trying to run a subquery inside a CASE statement in PySpark, and it throws an exception. I am trying to create a new flag that indicates whether an id in one table is present in a different table. Is this even possible in PySpark? I am using Spark 2.2.1. Here is the error:

Answer

This appears to be the latest
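In Spark 2.2, IN/EXISTS predicate subqueries are only supported in the WHERE clause, so a subquery inside a CASE expression in the SELECT list raises an AnalysisException. A common workaround is to left-join against the lookup table and then derive the flag with when/otherwise. Below is a minimal sketch; the table names (orders, valid_ids) and the id column are hypothetical stand-ins for the tables in the question.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("case-subquery-workaround").getOrCreate()

# Hypothetical stand-ins for the two tables in the question.
orders = spark.createDataFrame([(1,), (2,), (3,)], ["id"])
valid_ids = spark.createDataFrame([(2,), (3,), (4,)], ["id"])

# Left join against the lookup table, then derive the flag with
# when/otherwise instead of a subquery inside a CASE expression.
flagged = (
    orders
    .join(valid_ids.withColumnRenamed("id", "match_id"),
          orders["id"] == F.col("match_id"),
          "left")
    .withColumn("flag", F.when(F.col("match_id").isNotNull(), 1).otherwise(0))
    .drop("match_id")
)

flagged.show()
# id=1 gets flag 0 (no match); id=2 and id=3 get flag 1.
```

The plain left join keeps every row of the driving table so the flag can be 0 or 1; if only the matching rows were needed, a left_semi join would do the filtering without the helper column.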