I recently installed Apache Spark on my laptop, but it stopped working after a system update (packer -Syu). When I try running pyspark now, I get 'Unsupported major.minor version 52.0'. From SO* I found out this number (52) means the code was compiled for Java 8, so I installed that with packer. However, the error persists. I have no clue how to proceed from here, so any help is appreciated.
* https://stackoverflow.com/questions/103 … inor-versi
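For reference, the "major.minor" number in that error comes from the first bytes of the compiled .class file itself: bytes 7-8 hold the big-endian major version, and 52 corresponds to Java 8. A minimal sketch for checking it yourself (class_major is a hypothetical helper name, and the fake header below is only for demonstration):

```shell
# Hypothetical helper: print the class-file major version of a .class file.
# Bytes 7-8 (big-endian) hold the major version; 52 means "compiled for Java 8".
class_major() {
    # od -An -tu1 -N8: dump the first 8 bytes as unsigned decimals
    set -- $(od -An -tu1 -N8 "$1")
    echo $(( $7 * 256 + $8 ))
}

# Build a minimal fake .class header for demonstration:
# magic 0xCAFEBABE, minor 0, major 52 (octal 064).
printf '\312\376\272\276\000\000\000\064' > /tmp/Demo.class
class_major /tmp/Demo.class   # prints 52, i.e. compiled for Java 8
```

So the error means some class in Spark was built for Java 8 but is being loaded by an older JVM.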
Last edited by ahura (2017-07-19 15:04:29)
I need Java version 8, so I installed that with packer.
So, which Java / JRE / JDK / Oracle Java / OpenJDK versions of Java do you currently have installed? More than one version? From the official repos, or the AUR?
Last edited by drcouzelis (2017-07-19 14:39:21)
I have (output from pacman -Q | grep jdk):
jdk-devel 9b178-1
jdk8-openjdk 8.u131-1
jre7-openjdk-headless 7.u131_2.6.9-1
jre8-openjdk 8.u131-1
jre8-openjdk-headless 8.u131-1
All installed with packer.
However, 'java -version' reports 'java version "1.7.0_131"', so maybe the problem is that it still uses version 7 by default?
However, 'java -version' gives 'java version "1.7.0_131" ' so maybe the problem is that it still uses version 7 by default?
Yes, I think that is the problem.
It looks like there's a setting for that.
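On Arch, the default Java environment is managed with the archlinux-java tool (from java-runtime-common). A sketch of switching the default to Java 8, assuming the environment installed by jre8-openjdk is named java-8-openjdk (check the status output to confirm the exact name on your system):

```shell
# List installed Java environments; the current default is marked.
archlinux-java status

# Switch the system default to the Java 8 environment.
sudo archlinux-java set java-8-openjdk

# Verify the change took effect:
java -version   # should now report a 1.8.0_xxx version
```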
Omg, that actually did it. Thanks so much!
It looks like there's a setting for that.
Beauty! Thank you!