Hi all!
I am trying to pass a Typesafe config file to a spark-submit task and print the settings from that config file.
Code:

import org.slf4j.{Logger, LoggerFactory}
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.sql.SparkSession

object Bootstrap extends MyLogging {
  val spark: SparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
  val config: Config = ConfigFactory.load("application.conf")

  def main(args: Array[String]): Unit = {
    val url: String = config.getString("db.url")
    val user: String = config.getString("db.user")
    println(url)
    println(user)
  }
}
application.conf file:

db {
  url = "jdbc:postgresql://localhost:5432/test"
  user = "test"
}
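A likely cause of the error below: `ConfigFactory.load("application.conf")` looks the file up on the JVM classpath, while `--files` only copies it into the job's local fetch directory. A minimal sketch of a loader that falls back to reading the file directly (the `ConfigLoader` helper name and the `db` key check are illustrative assumptions, not part of the original code):

```scala
import java.io.File
import com.typesafe.config.{Config, ConfigFactory}

object ConfigLoader {
  // Hypothetical helper: prefer the classpath resource; if the expected
  // top-level key is absent, fall back to a plain file in the current
  // working directory (where files shipped with --files may land).
  def load(fileName: String): Config = {
    val fromClasspath = ConfigFactory.load(fileName)
    if (fromClasspath.hasPath("db")) fromClasspath
    else ConfigFactory.parseFile(new File(fileName)).resolve()
  }
}
```

Usage would then be `val config: Config = ConfigLoader.load("application.conf")` in place of the direct `ConfigFactory.load` call.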
I have uploaded the files to DBFS and used those paths to create the job.

Spark-submit job JSON:

{
  "new_cluster": {
    "spark_version": "6.4.x-esr-scala2.11",
    "azure_attributes": {
      "availability": "ON_DEMAND_AZURE",
      "first_on_demand": 1,
      "spot_bid_max_price": 1
    },
    "node_type_id": "Standard_DS3_v2",
    "enable_elastic_disk": true,
    "num_workers": 1
  },
  "spark_submit_task": {
    "parameters": [
      "--class",
      "Bootstrap",
      "--conf",
      "spark.driver.extraClassPath=dbfs:/tmp/",
      "--conf",
      "spark.executor.extraClassPath=dbfs:/tmp/",
      "--files",
      "dbfs:/tmp/application.conf",
      "dbfs:/tmp/code-assembly-0.1.0.jar"
    ]
  },
  "email_notifications": {},
  "name": "application-conf-test",
  "max_concurrent_runs": 1
}
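An alternative worth trying (a sketch, not verified on this cluster): instead of relying on the classpath, point the Typesafe loader at the shipped file with the standard `config.file` JVM property, which a no-argument `ConfigFactory.load()` honors. The assumption here is that the file fetched by `--files` is resolvable from the driver's and executors' working directories:

```json
"--conf", "spark.driver.extraJavaOptions=-Dconfig.file=application.conf",
"--conf", "spark.executor.extraJavaOptions=-Dconfig.file=application.conf",
```

With these parameters added, the application code would call `ConfigFactory.load()` with no arguments.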
I created the spark-submit job using the JSON above and tried to run it with the Databricks CLI.

Error:
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'db'
	at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
	at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:147)
	at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
	at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
	at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
	at Bootstrap$.main(Test.scala:16)
	at Bootstrap.main(Test.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
I can see the file being fetched in the logs (the lines below), but the configuration is not loaded:

21/09/22 07:21:43 INFO SparkContext: Added file dbfs:/tmp/application.conf at dbfs:/tmp/application.conf with timestamp 1632295303654
21/09/22 07:21:43 INFO Utils: Fetching dbfs:/tmp/application.conf to /local_disk0/spark-20456b30-fddd-42d7-9b23-9e4c0d3c91cd/userFiles-ee199161-6f48-4c47-b1c7-763ce7c0895f/fetchFileTemp4713981355306806616.tmp
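For what it's worth, the local fetch path shown in that log can be resolved programmatically: files shipped with `--files` are addressable through Spark's `SparkFiles` API once the SparkContext is up. A hedged sketch (assuming the `application.conf` name and the Typesafe config library from the code above):

```scala
import java.io.File
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.SparkFiles

// SparkFiles.get returns the local path Spark fetched the file to
// (the userFiles directory seen in the log lines above).
val confPath: String = SparkFiles.get("application.conf")
val config: Config = ConfigFactory.parseFile(new File(confPath)).resolve()
```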
Please help me pass this Typesafe config file to the spark-submit job using the appropriate spark-submit parameters.
Hi @Praveen Kumar Bachu

There are a few limitations for spark-submit tasks.

Please check the documentation for more information: https://docs.m.eheci.com/dev-tools/api/latest/jobs.html#jobssparksubmittask