1. Set Maven memory options
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
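A minimal sketch of setting and verifying the variable. Note that -XX:MaxPermSize only applies on JDK 7 and earlier; on JDK 8+ the JVM ignores it with a warning, since the permanent generation was removed.

```shell
# JVM options picked up by Maven's launcher script.
# (-XX:MaxPermSize matters only on JDK 7 and earlier.)
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Sanity check: confirm the shell will pass the options to mvn.
echo "$MAVEN_OPTS"
```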
2. Point Maven at the OSChina Maven Central mirror
First of all, thanks to OSChina for providing a Maven Central mirror inside China, which spares us the occasional pain of being unable to reach the overseas Maven repositories.
OSChina runs this domestic mirror on fast, stable infrastructure as a convenience for Maven users in China.
Using the OSChina mirror requires a small change to Maven's configuration. Under the conf directory of the Maven installation there is a settings.xml file; back it up before editing, then open settings.xml and modify it as follows.
<mirror>
    <id>nexus-osc</id>
    <mirrorOf>*</mirrorOf>
    <name>Nexus osc</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Addendum: if you also need OSChina's thirdparty repository, or multiple mirrors in general, change it as follows:
<mirror>
    <id>nexus-osc</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus osc</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
<mirror>
    <id>nexus-osc-thirdparty</id>
    <mirrorOf>thirdparty</mirrorOf>
    <name>Nexus osc thirdparty</name>
    <url>http://maven.oschina.net/content/repositories/thirdparty/</url>
</mirror>
The above configures Maven's mirror setting to point at the OSChina mirror. When Maven commands run, Maven also needs to install a number of plugin artifacts; their download location should likewise point at the OSChina mirror. Modify the following as well.
<repository>
    <id>nexus</id>
    <name>local private nexus</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>false</enabled></snapshots>
</repository>
<pluginRepository>
    <id>nexus</id>
    <name>local private nexus</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>false</enabled></snapshots>
</pluginRepository>
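Hand-editing settings.xml is easy to get wrong, so it helps to confirm the file is still well-formed XML before running Maven. A small sketch (the scratch path and inline fragment are illustrative; in practice point the check at your real settings.xml under conf or ~/.m2):

```shell
# Write an illustrative settings fragment and check it parses as XML.
cat > /tmp/settings-check.xml <<'EOF'
<mirrors>
  <mirror>
    <id>nexus-osc</id>
    <mirrorOf>*</mirrorOf>
    <name>Nexus osc</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
  </mirror>
</mirrors>
EOF

# python3 is used here only as a portable XML well-formedness check.
python3 -c "import xml.etree.ElementTree as ET; ET.parse('/tmp/settings-check.xml')" \
  && echo "settings fragment is well-formed"
```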
3. Build the Spark source
./make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
Log from a successful build:
[INFO]
[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-examples_2.10 ---
[INFO] Building jar: /home/hadoop/Downloads/spark-1.5.0-cdh5.6.0/examples/target/spark-examples_2.10-1.5.0-cdh5.6.0-sources.jar
[INFO]
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-examples_2.10 ---
[INFO] Building jar: /home/hadoop/Downloads/spark-1.5.0-cdh5.6.0/examples/target/spark-examples_2.10-1.5.0-cdh5.6.0-test-sources.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  7.572 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 22.126 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 12.666 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  7.181 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 22.535 s]
[INFO] Spark Project Core ................................. SUCCESS [08:46 min]
[INFO] Spark Project Bagel ................................ SUCCESS [01:27 min]
[INFO] Spark Project GraphX ............................... SUCCESS [01:46 min]
[INFO] Spark Project Streaming ............................ SUCCESS [02:46 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [03:39 min]
[INFO] Spark Project SQL .................................. SUCCESS [07:24 min]
[INFO] Spark Project ML Library ........................... SUCCESS [07:47 min]
[INFO] Spark Project Tools ................................ SUCCESS [03:05 min]
[INFO] Spark Project Hive ................................. SUCCESS [10:07 min]
[INFO] Spark Project External Flume Sink .................. SUCCESS [01:57 min]
[INFO] Spark Project External Flume ....................... SUCCESS [01:52 min]
[INFO] Spark Project External Kafka ....................... SUCCESS [01:23 min]
[INFO] Spark Project REPL ................................. SUCCESS [05:10 min]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [01:14 min]
[INFO] Spark Project YARN ................................. SUCCESS [08:07 min]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [05:37 min]
[INFO] Spark Project Assembly ............................. SUCCESS [05:09 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [07:22 min]
[INFO] Spark Project External MQTT ........................ SUCCESS [07:06 min]
[INFO] Spark Project External MQTT Assembly ............... SUCCESS [ 32.661 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [06:55 min]
[INFO] Spark Project Examples ............................. SUCCESS [14:41 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:55 h
[INFO] Finished at: 2016-03-20T11:17:44+08:00
[INFO] Final Memory: 110M/1360M
[INFO] ------------------------------------------------------------------------
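Since the build runs for nearly two hours, it is worth capturing the output to a file (for example with ./make-distribution.sh ... 2>&1 | tee build.log) and checking the result afterwards. A sketch of such a check; the log content below is a stub standing in for real captured output:

```shell
# Stub log standing in for output captured from the real build.
cat > /tmp/build.log <<'EOF'
[INFO] BUILD SUCCESS
[INFO] Total time: 01:55 h
EOF

# Fail fast if the reactor did not finish cleanly.
if grep -q 'BUILD SUCCESS' /tmp/build.log; then
  echo "build ok"
else
  echo "build failed" >&2
  exit 1
fi
```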