Tutorial: Use a REPL shell against development endpoints
In Amazon Glue, you can create a development endpoint and then invoke a REPL (Read-Eval-Print Loop) shell to run PySpark code incrementally, so that you can interactively debug your ETL scripts before deploying them.
To use the REPL on a development endpoint, you need to have SSH authorization to the endpoint.
- On your local computer, open a terminal window that can run SSH commands, and paste in the edited SSH command. Run the command.
Assuming that you accepted Amazon Glue version 1.0 with Python 3 for the development endpoint, the output will look something like this:
Python 3.6.8 (default, Aug  2 2019, 17:42:44)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-28)] on linux
Type "help", "copyright", "credits" or "license" for more information.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/aws/glue/etl/jars/glue-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2019-09-23 22:12:23,071 WARN  [Thread-5] yarn.Client (Logging.scala:logWarning(66)) - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2019-09-23 22:12:26,562 WARN  [Thread-5] yarn.Client (Logging.scala:logWarning(66)) - Same name resource file:/usr/lib/spark/python/lib/pyspark.zip added multiple times to distributed cache
2019-09-23 22:12:26,580 WARN  [Thread-5] yarn.Client (Logging.scala:logWarning(66)) - Same path resource file:///usr/share/aws/glue/etl/python/PyGlue.zip added multiple times to distributed cache.
2019-09-23 22:12:26,581 WARN  [Thread-5] yarn.Client (Logging.scala:logWarning(66)) - Same path resource file:///usr/lib/spark/python/lib/py4j-src.zip added multiple times to distributed cache.
2019-09-23 22:12:26,581 WARN  [Thread-5] yarn.Client (Logging.scala:logWarning(66)) - Same path resource file:///usr/share/aws/glue/libs/pyspark.zip added multiple times to distributed cache.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Python version 3.6.8 (default, Aug 2 2019 17:42:44)
SparkSession available as 'spark'.
>>>
- Test that the REPL shell is working by typing the statement print(spark.version). As long as that displays the Spark version, your REPL is now ready to use.
- Now you can try executing the following simple script, line by line, in the shell:
# Import the Spark and Glue context classes used in the REPL session.
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.transforms import *

# Wrap the already running SparkContext in a GlueContext.
glueContext = GlueContext(SparkContext.getOrCreate())

# Load the persons_json table from the legislators database in the
# Data Catalog as a DynamicFrame.
persons_DyF = glueContext.create_dynamic_frame.from_catalog(database="legislators", table_name="persons_json")

# Print the record count and the inferred schema.
print("Count: ", persons_DyF.count())
persons_DyF.printSchema()
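If the count and schema print as expected, you can keep working with the same objects in the session. The following is a minimal sketch of one way to continue exploring interactively; the field names passed to select_fields are assumptions based on the legislators sample dataset, so substitute whatever printSchema() reported for your table.

# Convert the DynamicFrame to a Spark DataFrame to use the full DataFrame API.
persons_df = persons_DyF.toDF()

# Peek at a few rows interactively.
persons_df.show(5)

# Keep only a subset of fields in the DynamicFrame before further transforms.
# NOTE: "family_name" and "given_name" are assumed field names; adjust them
# to match the schema printed above.
names_DyF = persons_DyF.select_fields(["family_name", "given_name"])
names_DyF.printSchema()

Because each statement runs immediately, this is a convenient way to validate transforms one at a time before copying them into an ETL script.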