Unable to run pyspark
I installed Spark on Windows, but I'm unable to start pyspark. When I type c:\spark\bin\pyspark, I get the following error:
Python 3.6.0 |Anaconda custom (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "c:\spark\bin\..\python\pyspark\shell.py", line 30, in <module>
    import pyspark
  File "c:\spark\python\pyspark\__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "c:\spark\python\pyspark\context.py", line 36, in <module>
    from pyspark.java_gateway import launch_gateway
  File "c:\spark\python\pyspark\java_gateway.py", line 31, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "c:\spark\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 18, in <module>
  File "c:\users\eigenaar\anaconda3\lib\pydoc.py", line 62, in <module>
    import pkgutil
  File "c:\users\eigenaar\anaconda3\lib\pkgutil.py", line 22, in <module>
    ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
  File "c:\spark\python\pyspark\serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'
What am I doing wrong here?
Spark 2.1.0 doesn't support Python 3.6.0. To solve this, change the Python version in your Anaconda environment. Run the following commands in your Anaconda prompt:
conda create -n py35 python=3.5 anaconda
activate py35
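For background on why this fails (my reading of the traceback, not stated in the original answer): Python 3.6 changed `collections.namedtuple` so that its optional arguments (`verbose`, `rename`, `module`) are keyword-only, and Spark 2.1.0's `serializers.py` re-wraps `namedtuple` in a way that doesn't carry those keyword-only defaults over, so they show up as "missing required" arguments. A minimal sketch of the signature rule itself:

```python
import collections

# Since Python 3.6, namedtuple's optional arguments are keyword-only.
# Passing them by keyword works on every supported version:
Point = collections.namedtuple('Point', ['x', 'y'], rename=False)
print(Point(1, 2))

try:
    # Passing an optional argument positionally, as older patched
    # code effectively did, raises a TypeError:
    collections.namedtuple('Bad', ['x'], False)
except TypeError as exc:
    print('positional call rejected:', exc)
```

This is why downgrading to Python 3.5 (where Spark 2.1.0's patching still matches the stdlib signature) makes the shell start again; later Spark releases fixed the patch instead.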