[PySpark SQL]: SparkConf does not exist in the JVM


takao
Hi,

`pyspark.sql.SparkSession.builder.getOrCreate()` gives me an error, and I
wonder if anyone can help me with this.

The line of code that gives me an error is

```
with spark_session(master, app_name) as session:
```

where spark_session is a Python context manager:

```
import contextlib
import os

import pyspark.sql

@contextlib.contextmanager
def spark_session(master, app_name):
    session = pyspark.sql.SparkSession.builder\
        .master(master).appName(app_name)\
        .config("spark.executorEnv.PYTHONPATH", os.getenv("PYTHONPATH"))\
        .getOrCreate()
    try:
        yield session
    finally:
        session.stop()
```

The error message is

```
/usr/local/lib/python3.6/site-packages/pyspark/sql/session.py:170: in
getOrCreate
    sparkConf = SparkConf()
/usr/local/lib/python3.6/site-packages/pyspark/conf.py:116: in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
py4j.protocol.Py4JError: SparkConf does not exist in the JVM
```

I am running a local Spark cluster; Spark's version is 2.3.2, and PySpark is
also version 2.3.2.
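In case it helps with diagnosis, I understand this py4j error often comes from PySpark picking up a different Spark installation than the pip-installed package (e.g. via SPARK_HOME). A small stdlib-only sketch of the check I mean (the exact paths are of course environment-specific):

```
import os
import importlib.util

# Locate the installed pyspark package, if any, and whatever SPARK_HOME
# points at; a mismatch between the two is a commonly reported cause of
# "... does not exist in the JVM" py4j errors.
spec = importlib.util.find_spec("pyspark")
pyspark_path = spec.origin if spec else None
spark_home = os.environ.get("SPARK_HOME")

print("pyspark package:", pyspark_path)
print("SPARK_HOME:     ", spark_home)
```

If SPARK_HOME points at a Spark build whose version differs from the pyspark package, that would explain the JVM-side class lookup failing.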

Thanks in advance.
Takao



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]