I installed Spark on my Mac with Homebrew, and I'm trying to find the directory it was installed into. What do I need to run in the Mac terminal or in the Spark shell to find Spark's installation directory?
Update:
Code:
brew info apache-spark
Output:
apache-spark: stable 2.3.2, HEAD
Engine for large-scale data processing
https://spark.apache.org/
/usr/local/Cellar/apache-spark/2.3.2 (1,058 files, 244.6MB) *
Built from source on 2018-10-30 at 14:16:30
From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/apache-spark.rb
==> Requirements
Required: java = 1.8 ✔
==> Options
--HEAD
Install HEAD version
==> Analytics
install: 4,534 (30 days), 14,340 (90 days), 56,698 (365 days)
install_on_request: 4,263 (30 days), 13,490 (90 days), 51,876 (365 days)
build_error: 0 (30 days)
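The starred Cellar line above is already the install location. If you only want the path itself, Homebrew can print it directly; a minimal sketch (the exact version directory and prefix will differ between machines):
$ brew --prefix apache-spark    # typically prints /usr/local/opt/apache-spark, a symlink into the Cellar keg shown above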
Code:
which spark-shell
Output:
/Users/sshields/anaconda2/bin/spark-shell
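Note that which spark-shell resolves to the Anaconda copy here, not the Homebrew one, because anaconda2/bin comes first on PATH. To see where that launcher actually points and whether SPARK_HOME is set at all, something like the following can help (a hedged sketch; SPARK_HOME may simply be unset):
$ ls -l "$(which spark-shell)"   # follow any symlink to the real launcher script
$ echo "$SPARK_HOME"             # empty if the variable was never exported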
You should use brew info apache-spark; if you have installed it with brew install, the path will be included in the output. (Mine isn't installed, which is why no path appears in the output below.)
$ brew info apache-spark
apache-spark: stable 2.3.2, HEAD
Engine for large-scale data processing
https://spark.apache.org/
Not installed
From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/apache-spark.rb
==> Requirements
Required: java = 1.8 ✔
==> Options
--HEAD
Install HEAD version
==> Analytics
install: 4,534 (30 days), 14,340 (90 days), 56,698 (365 days)
install_on_request: 4,263 (30 days), 13,490 (90 days), 51,876 (365 days)
build_error: 0 (30 days)
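Once apache-spark is actually installed with brew install apache-spark, the Cellar path shows up in the brew info output exactly as in the update above. The Spark distribution itself (bin/, conf/, jars/) usually sits in the keg's libexec directory; this is an assumption based on the usual layout of the Homebrew apache-spark formula, so verify it on your own machine:
$ brew install apache-spark
$ ls "$(brew --prefix apache-spark)/libexec"   # expected to contain bin/ conf/ jars/ if the formula follows the usual layout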